I’m not sure there is a whole lot you can do unless you spend relatively big money on a calibrated/certified thermometer. I did that years ago because I needed (wanted is probably more accurate) one for non-photographic purposes, and I use it as a reference for the cheaper thermometers I actually use for photo work (several Patersons). They are all within a few tenths of a degree C of the reference, and that is plenty good for black and white photography. Color processing is more sensitive, but I don’t know what level of accuracy it needs.
Apart from the technical points in post #38 above, one thing I learned from a career in biology was that temperature is a momentary measure reflecting the dynamic relationship between the thermometer and its environment, unless that environment is extraordinarily stable. Two theoretically identical thermometers (if they can be found), placed in the same bath, may never read exactly the same. Having said that, any one glass thermometer will do the same thing in the same circumstances, every time. How precisely do you want to measure temperature, and how accurate must that be? Certification presumably relates to accuracy. But surely the important aspect of the Kodak recommendation is precision (i.e. cut out variation for consistent results), rather than accuracy relative to the freezing point of krypton or something.

Are there any current offerings in the way of thermometers that can be trusted fully?
Apologies. You're right. I deleted the offending paragraph.

Correct me if I'm wrong, but if they are supposed to be accurate to +/- 0.12 C, then the maximum difference between the two thermometers should be 0.24 C, no?
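For what it's worth, here is a minimal sketch of that worst case in Python, assuming the +/- 0.12 C figure above applies to each thermometer independently (the 20 C reference value is just an arbitrary example):

```python
# Worst case: one thermometer reads 0.12 C high while the other reads 0.12 C low.
tolerance = 0.12   # degrees C per thermometer, the figure quoted above
true_temp = 20.0   # arbitrary reference temperature for illustration

highest_reading = true_temp + tolerance
lowest_reading = true_temp - tolerance

print(round(highest_reading - lowest_reading, 2))  # 0.24 C maximum disagreement
```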
I have been thinking about Kodak and Paterson.
Kodak, being American, would most likely use the Fahrenheit scale, whilst Paterson, being European, I think, would use the Celsius scale.
With the help of an online converter I found that each step up of 1 C would be 1.8 F on a Fahrenheit thermometer scale.
So would that make the Fahrenheit scale a more accurate scale to use?
Also, a given percentage accuracy, e.g. 1%, would be completely different in practice between the two scales.
Can someone tell me if my reasoning is correct, please?
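If it helps, here is the conversion written out as a small Python sketch; the 1.8 figure falls straight out of the standard formula, and the 20 C / 21 C values are just arbitrary examples:

```python
def c_to_f(celsius):
    """Standard Celsius-to-Fahrenheit conversion."""
    return celsius * 9.0 / 5.0 + 32.0

print(round(c_to_f(20.0), 2))                   # 68.0 F
print(round(c_to_f(21.0), 2))                   # 69.8 F
print(round(c_to_f(21.0) - c_to_f(20.0), 2))    # 1.8: a 1 C rise is a 1.8 F rise
```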
Standard practice for calibrating a liquid thermometer is to measure the reported temperature in an ice bath and a beaker of boiling water, although many thermometers are not actually rated to withstand 100°C.
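As a rough sketch of what that two-point check gives you (the readings are hypothetical, and it assumes the scale error is linear and that water actually boils at 100°C where you are):

```python
# Hypothetical two-point calibration of a thermometer that reads slightly off.
ice_reading = 0.4    # what it shows in a well-stirred ice bath (true 0 C)
boil_reading = 99.2  # what it shows in briskly boiling water (true 100 C at sea level)

def corrected(raw_reading):
    """Map a raw reading onto the true 0-100 C scale, assuming a linear error."""
    span = boil_reading - ice_reading
    return (raw_reading - ice_reading) * 100.0 / span

print(round(corrected(38.0), 2))  # a raw 38.0 C reading corrects to about 38.06 C
```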
You can always use the ice point and boiling point to check the thermometer. For example, the Kodak Process Thermometer Type 3 only reads from 50 to 140F, so it can't read either the ice point or boiling point.
I expect you meant to say: "You can't always use the ice point and boiling point to check the thermometer. For example, the Kodak Process Thermometer Type 3 only reads from 50 to 140F, so it can't read either the ice point or boiling point."
Yes, you're right. I meant to say you can't. If you subject such a thermometer to 212F, I am quite sure it would be damaged.
The scale doesn't matter.
Perhaps I did not explain myself very well.

On a thermometer like this, the Celsius markings are farther apart, and there is more room to add markings for partial degrees Celsius.
So there is no difference.
Perhaps I did not explain myself very well.
Assuming a rise of 1 degree C is the same as a rise of 1.8 degrees F:
To find 1% of 1 degree C, divide by 100. Likewise, to find 1% of 1.8 degrees F, divide by 100.
This equals 0.01 of a degree C and 0.018 of a degree F respectively.
Both figures are the same temperature shift.
But possibly a non-mathematical person such as myself, looking at these figures or ratios with reference to a percentage accuracy figure, would think the Centigrade figure was more accurate.
Sorry, the thought is in my head, but trying to get it across is one of those things you either see or you don't.
Hopefully someone else is on the same wavelength as me.
Thanks for replying.
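To put the arithmetic above into a small Python sketch: a temperature difference converts between the scales without the 32-degree offset, so 1% of 1 C and 1% of 1.8 F come out as exactly the same physical shift:

```python
def delta_c_to_delta_f(delta_c):
    """Convert a temperature difference from Celsius to Fahrenheit degrees.
    No +32 offset here, because a difference has no zero point."""
    return delta_c * 9.0 / 5.0

one_percent_of_1_c = 1.0 / 100.0    # 0.01 C
one_percent_of_1_8_f = 1.8 / 100.0  # 0.018 F

print(round(delta_c_to_delta_f(one_percent_of_1_c), 3))  # 0.018, the same shift in F
print(round(one_percent_of_1_8_f, 3))                    # 0.018
```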
Hot tap water is usually about 50°C/120°F. If your thermometer's range is less than that, you could risk breaking the thermometer by leaving it in too-hot water for a long time (the internal pressure can crack the tube or top of the glass column). However, if your thermometer is intact and the fluid column is not separated, no damage has been done and the thermometer is likely as accurate as it always was.
Do check for cracks and fluid separation. The latter problem can be remedied by heating the thermometer slowly till the fluid reaches the very top of the column and rejoins, then removing it quickly from the heat source and letting it cool slowly.
If you're still in doubt, find a reliable thermometer to check against.
Doremus
Would it make sense to measure the outside temperature and compare the result to the actual report of the local weather station?
It may or may not be useful, depending on how close you live to your local airport and what the terrain is like between "here" and "there".