C or F

Exactly
 
The problem with the example is that the resolution is much finer than the accuracy. So the resolution doesn't impede the accuracy of the instrument; it just gives the user a false sense that the accuracy might be better than it actually is.

What I've been trying to describe is the reverse - where the instrument is more accurate than the resolution. In that case, the resolution becomes the limiting factor, so for practical purposes the accuracy is reduced.

Think of the gauges in a car. Some cars have detailed indicators of oil pressure and temperature, while others just have warning lights that come on when there is a problem. In both cases, the oil pressure and temperature sensing systems may be quite accurate at measuring what they do, but the gauges communicate the information with greater accuracy than the warning lights do.

An example can be found in what we started with - a thermometer that reads in whole degrees only (and is digital and therefore doesn't permit you to interpolate between the digits). If you switch scales from Fahrenheit to Celsius, the accuracy of the reading goes down.
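As a back-of-the-envelope sketch of that effect (the 20.4 °C true temperature is a made-up example, not a number from the thread):

```python
# Sketch: rounding error when a digital display shows whole degrees only.
# A hypothetical true temperature of 20.4 C (= 68.72 F) read on each scale.

def c_to_f(c):
    return c * 9 / 5 + 32

true_c = 20.4
true_f = c_to_f(true_c)

shown_c = round(true_c)   # whole-degree Celsius display -> 20
shown_f = round(true_f)   # whole-degree Fahrenheit display -> 69

err_c_in_f = abs(c_to_f(shown_c) - true_f)  # ~0.72 F off on the C display
err_f_in_f = abs(shown_f - true_f)          # ~0.28 F off on the F display
print(err_c_in_f, err_f_in_f)
```

Worst case, a whole-degree C display can be off by 0.5 C = 0.9 F purely from rounding, versus 0.5 F for the F display, which is the sense in which the reading's accuracy "goes down" after switching scales.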

If you are talking about analog thermometers, they tend to include interpolation aids on both scales that yield close to the same absolute resolution, so it usually doesn't matter.
 
All the thermometers I use are scientific ones, so they're marked in Centigrade; I've used metric measurements since school, so it's more natural to me. These days I mainly use a digital thermometer which enables me to keep my chemistry within +/- 0.2ºC of the chosen temperature.

Ian
 
I don't think I could get a Fahrenheit thermometer here if I wanted to so Celsius is more accurate, more precise and more possible.

 
Who out of all of you who have mercury or alcohol based thermometers has tested them against scientifically calibrated thermometers? I'd guess none of you, so you're all talking out of the wrong orifice. None of these thermometers is accurate to the level at which being marked in F or C would make the slightest difference to accuracy. Only expensive certified thermometers are accurate to that level. And as I already pointed out, they don't need to be accurate; they need to be consistent to your chosen mark on the scale, whatever that is.

This is a pointless discussion.
 
"Richtigkeit und Präzision" = accuracy and precision in English; no real difference.

Is a thermometer marked in 1/2-degree C increments more accurate than one in whole-degree F? How much accuracy does one need anyway for film processing?

An acceptable margin of error is the only thing needed for film processing. If your thermometer can deliver that, you are fine. (but "only to a degree" :whistling:)

FWIW, I'd much rather have an "accurate" thermometer than a "precise" one that is a couple of degrees off. Careful workers keep a reference thermometer and calibrate to it periodically. I have a calibrated Kodak process thermometer for this that I keep well-stored and use only for periodic calibration (BTW, these are not that hard to find used and are quite accurate). I have a couple of dial thermometers with labels on them that read, e.g., "71° = 68°" just so I know where to set the temp. Since that agrees with the reference, I don't really care about the actual "reading."

Doremus
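Doremus's "71° = 68°" labels amount to applying a constant calibration offset. A minimal sketch of the idea, using the numbers from the label above and assuming the offset holds across the working range:

```python
# Sketch: correcting a dial thermometer against a calibrated reference.
reference_reading = 68.0  # calibrated process thermometer, degrees F
dial_reading = 71.0       # what the uncalibrated dial shows at the same moment

offset = dial_reading - reference_reading  # the dial reads 3 F high

def true_temp(dial):
    """Convert a dial reading to the corrected temperature."""
    return dial - offset

print(true_temp(71.0))  # 68.0 -- i.e. "71 = 68"
```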
 
I agree with RobC and Doremus on their points. Back in the day when I thought having an expensive thermometer would make me a better photographer (still waiting for that BTW), I bought a relatively high-precision NIST-traceable one for calibrating my Jobo bath (I know someone will ask -- must set 23.7C to get 24.0C) and another glass thermometer. The point is that you are calibrating your entire system. Your thermometer error, light-meter error, shutter speed errors, etc., if consistent, combine to produce the final product (e.g. print) you want. So, pick a thermometer of convenient length and resolution (i.e. not one you can hang on your jacket zipper), and use that one.
Mike
 
Accuracy is a measure of how close the instrument can come to correctly reflecting a true reference value. Precision is related to how small of an increment of measure can be reproduced reliably.

Given accurate calibration in both thermometers, your example is a demonstration of the greater precision of the F scale. In other words, both instruments will give you the same result accuracy, but the F instrument will slice that result with a finer precision than the C instrument.

Ken

That's true, but the C scale or F scale can be further divided into tenths, hundredths, etc., so the choice of F or C really isn't relevant.
 
Accuracy is a measure of how close the instrument can come to correctly reflecting a true reference value. Precision is related to how small of an increment of measure can be reproduced reliably.

Given accurate calibration in both thermometers, your example is a demonstration of the greater precision of the F scale. In other words, both instruments will give you the same result accuracy, but the F instrument will slice that result with a finer precision than the C instrument.

Ken

On my C/F thermometer, I can read F to a precision of 2 degrees. I can read Celsius to a precision of 1 C, which is equivalent to 1.8 F.

See pic. F is on the left, C on the right.


Precision is a function of the instrument, not the scale.
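For the specific thermometer described above, the two sets of graduations can be compared directly by converting both to one scale (a quick sketch; the 2 F and 1 C spacings are taken from the post):

```python
# Sketch: comparing graduation spacing on a dual-scale thermometer.
f_mark_spacing = 2.0               # F-side marks every 2 F
c_mark_spacing_in_f = 1.0 * 9 / 5  # C-side marks every 1 C = 1.8 F

# Here the C side happens to read finer, illustrating that precision
# comes from the instrument's graduations, not from the scale's name.
print(c_mark_spacing_in_f < f_mark_spacing)  # True
```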
 
The accuracy of a thermometer is based on the length of the mercury column and the diameter of the column. Since the typical blank could be engraved with either a centigrade or a Fahrenheit scale it really makes no difference.

That's true, but the C scale or F scale can be further divided into tenths, hundredths, etc., so the choice of F or C really isn't relevant.

Precision is a function of the instrument, not the scale.

You guys are right, and I'm wrong.

The scale printed on the instrument has no effect on precision. It allows one to gauge precision, but it doesn't contribute to it. One could even wipe the scale markings off entirely with paint remover, and the blank instrument's precision (repeatability) would remain unchanged.

On that graph I posted earlier, mentally overlaying an 'F' or a 'C' scale on the x-axis has no effect whatsoever on the existence or width of the precision value plot.

I still believe my base definitions were correct. But my illustration of them was incorrect.

Ken
 
Develop at -40 and you can't go wrong.
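The joke checks out arithmetically: -40 is the one temperature where the two scales agree.

```python
# -40 C equals -40 F: solving c = 9c/5 + 32 for c gives c = -40.
def c_to_f(c):
    return c * 9 / 5 + 32

print(c_to_f(-40.0))  # -40.0
```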
 
As I think about this, it occurs to me that the subject is almost appropriate for the "Philosophy" portion of the "Ethics and Philosophy" sub-forum.

If you consider the scale or the readout or whatever you use as an interface between the instrument and the user to be an integral part of the instrument, and therefore its accuracy/precision, then the nature of that interface is important to the issue.
 
Only to a degree.

:D

As someone else explained, repeatability is what we need. That is certainly where the old dial thermometers faltered. Handy things, but they easily fell out of calibration in the hands of students. I am using a digital thermometer right now, mixing water from separate hot and cold taps to the desired temps. It is pretty handy, and I can judge how close I got the mix to temperature by the speed of the changing numbers on the readout.
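Mixing from two taps can be estimated with a simple heat balance. A sketch with made-up tap temperatures (it assumes equal flow properties and no heat loss, which a real sink won't quite honor):

```python
# Sketch: fraction of the mix that must come from the hot tap to hit a target.
def hot_fraction(cold_f, hot_f, target_f):
    # Heat balance: target = f * hot + (1 - f) * cold  ->  solve for f
    return (target_f - cold_f) / (hot_f - cold_f)

print(hot_fraction(55.0, 125.0, 68.0))  # about 0.19 of the flow from the hot tap
```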
 
What are you sticking your fingers into? :smile:

Speaking of which, how relevant is your measured temperature - to the processing that you are doing?

When tray processing, for example, I can put the probe directly in the tray and it will measure the temperature of the same bath that is developing the film.

But when processing film in tanks, I have a bucket of water that acts like a water jacket (sort of). I put the beaker in the bucket before processing. During processing I put the tank in the bucket. I position the probe on the incoming water stream, or I might position it in the bucket - but I can't put the probe in the tank.

For example, that same probe I linked to (with educational accuracy and precision) reads the following:

69.12 F developer in beaker
70.70 F water in bucket
71.04 F water stream pouring into bucket

And an uncalibrated dial thermometer on the filter stage reads about 73 degrees F. The water is flowing slowly enough, and the ambient temperature is cold enough, to explain the different readings between the faucet and the spigot. And the bucket/beaker differences are also reasonable.
 
F, because it's familiar. C, because 20's a nice round number.
 
As long as I'm not trying to mix developer at 68 degrees using a centigrade thermometer I really don't care.
 
I suspect that the most inaccurate part of most thermometer scales is the registration of the thermometer to the printer when screen printing the scale on.


Steve.
 
As I think about this, it occurs to me that the subject is almost appropriate for the "Philosophy" portion of the "Ethics and Philosophy" sub-forum.

If you consider the scale or the readout or whatever you use as an interface between the instrument and the user to be an integral part of the instrument, and therefore its accuracy/precision, then the nature of that interface is important to the issue.

Exactly, but if I had put it there a mod would have moved it.
 