I have a Weston analog thermometer that was reading frozen distilled water at slightly under 31F. Since I want to start doing C-41 processing, I decided this wasn't good enough. Last night I purchased a Taylor digital thermometer and ran the same test: 30.9F! Before I buy a third, probably more expensive, thermometer, can someone comment on this method of calibration? I was prepared to accept give or take half a degree, but I find it curious that both thermometers are off by this much in the same direction. Is there a physics lesson that I'm missing? Thanks.

ETA: Both are stem/probe type thermometers.
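For what it's worth, here is the arithmetic I had in mind if the ice-point test is valid: treat the error as a constant offset and add it to future readings. This is just a sketch of that single-point correction (the 32.0F reference and the function names are mine for illustration; a constant-offset error is an assumption, not a given):

```python
# Hypothetical single-point offset correction from an ice-point check.
# Assumes the thermometer's error is a constant offset across the range of interest.

ICE_POINT_F = 32.0  # true temperature of a well-mixed ice/water slurry at sea level

def offset_from_ice_point(reading_f: float) -> float:
    """Offset to add to future readings, derived from one ice-point reading."""
    return ICE_POINT_F - reading_f

def corrected(reading_f: float, offset_f: float) -> float:
    """Apply the offset to a raw reading."""
    return reading_f + offset_f

offset = offset_from_ice_point(30.9)      # +1.1 F for the Taylor's reading
print(corrected(100.0, offset))           # a raw 100.0 F reading would really be ~101.1 F
```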