Just to provide some context to what I'm saying about thermometers, a few months back I tested all of the thermometers I have against my gold standards. I have two glass thermometers: a NIST-certified alcohol model, and a mercury thermometer that isn't NIST-certified (NIST won't certify mercury thermometers anymore) but is certified by an alternate authority and checked against the alcohol model. They match my two digital NIST-certified thermometers to within 0.2C, yet even among these four extremely accurate thermometers there is still some disagreement, and I get about 0.5C of range between them. Both glass thermometers have marks at 0.1C resolution, which means they're about half a meter long to cover 0-50C at that resolution, and each cost about $100.

I then went through and checked my collection of about 35 digital and analog thermometers, including process thermometers from Kodak, Ilford and Paterson. Of those 35, exactly five were accurate to under a degree, and only two were accurate to under 0.5C. So I suppose you could randomly end up with some accurate thermometers, but the odds are against you, and unless you have a certified accurate thermometer to compare against, you'll never actually know how accurate the devices you own are.
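If it helps, here's a rough sketch of the kind of comparison I'm describing, written as a little Python script. The thermometer names and readings are made up purely for illustration; the point is just "read every thermometer in the same bath, compare to the certified reference, and see how many land within tolerance."

```python
# Compare each thermometer against a certified reference reading taken in
# the same water bath. All names and numbers are invented for illustration.

reference_c = 38.0  # certified, NIST-traceable reference, degrees C

readings_c = {
    "glass alcohol (certified)": 38.0,
    "digital probe A": 38.1,
    "dial process thermometer": 39.4,
    "cheap strip thermometer": 36.8,
}

for name, reading in readings_c.items():
    error = reading - reference_c
    verdict = "within 0.5 C" if abs(error) <= 0.5 else "off by more than 0.5 C"
    print(f"{name}: {reading:.1f} C, error {error:+.1f} C, {verdict}")
```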
To compound that, taking measurements from my Jobo in various places, I was getting 2C of variation based on _where_ I placed the thermometer. Even inside the developer bottle there is a fair bit of variation from the bottom to the top of the bottle. The water at the pump outlet is generally a very different temperature from the water in and around the developer bottle. The actual Jobo temperature sensor is inside the pump housing, and that reading is a little more than a degree hotter than the reading on the front of the processor, as someone noted earlier in this thread. Add to that the fluid dynamics inside the spinning developer bottle, and the task of ensuring temperature uniformity just becomes insanely complex.
This is why I advocate test strips. Densitometry is far easier and cheaper than getting accurate temperature readings, which is why test strips are the standard process control for professional labs. With test strips, it doesn't actually matter how accurate your thermometers are, as long as you take the measurements in the same place and the measurements are repeatable. Glass thermometers are better for this than digital, since even the best digital thermometers can drift, but the key is to find the temperature, as measured on whatever thermometer you own, that produces consistent test strips.
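To put the "repeatability beats accuracy" point in concrete terms, here's a small Python sketch of the sort of log you might keep. The patch names and density values are invented; the idea is that you track the densities of the same patches on each test strip, all run at the same nominal temperature on your own thermometer, and watch the spread between runs rather than worrying about the "true" temperature.

```python
# Track run-to-run spread of test-strip densities developed at the same
# nominal temperature on *my* thermometer. All values are illustrative.

from statistics import mean, pstdev

# Densities read off the same three patches across several runs.
runs = [
    {"D-min": 0.05, "mid-tone": 0.72, "D-max": 2.10},
    {"D-min": 0.05, "mid-tone": 0.74, "D-max": 2.08},
    {"D-min": 0.06, "mid-tone": 0.71, "D-max": 2.12},
]

for patch in ("D-min", "mid-tone", "D-max"):
    values = [run[patch] for run in runs]
    print(f"{patch}: mean {mean(values):.2f}, spread (std dev) {pstdev(values):.3f}")

# A consistently tight spread means the process is under control,
# whatever the thermometer's absolute calibration happens to be.
```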