mshchem
Subscriber
Every time someone reports a finding without an understanding of the uncertainty, it loses credibility.

Well, both are terms from metrology that I find rather puzzling in this context, both in themselves and because the terms vary between languages.
Precision means the spread of the values obtained from repeated measurements of the same state.
Accuracy means the deviation of the centre of that spread from the true value.
See also this graphic:
https://en.wikipedia.org/wiki/Accuracy_and_precision#/media/File:Accuracy_and_precision.svg
Thus accuracy is NOT the greatest error...
Now you will likely understand why I do not consider these terms helpful here.
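A minimal sketch of the distinction, with assumed numbers (a "true" value of 10.00 and made-up measurement offsets and spreads), might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)
true_value = 10.00  # assumed true value of the state being measured

# Simulated repeated measurements: small spread but offset mean (precise, not accurate)
precise_biased = rng.normal(loc=10.30, scale=0.02, size=20)
# Large spread but mean near the true value (accurate, not precise)
imprecise_unbiased = rng.normal(loc=10.00, scale=0.30, size=20)

for label, data in [("precise / biased", precise_biased),
                    ("imprecise / unbiased", imprecise_unbiased)]:
    spread = data.std(ddof=1)          # precision: scatter of the repeated measurements
    offset = data.mean() - true_value  # accuracy: deviation of the mean from the true value
    print(f"{label:22s} spread = {spread:.3f}, mean offset = {offset:+.3f}")
```

The first series reports a small spread but a large mean offset, the second the reverse, which is exactly the difference the Wikipedia graphic shows.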
When I went into manufacturing in the US after working in a lab, I was stunned. One thing was the nightmare of English units (BTUs, etc.) versus SI. Secondly, no error estimate was ever attached to a stated value. People would compare 0.124 vs 0.120 and think they had made some breakthrough. I would say they were both 0.1, which would drive these guys crazy. I think it's directly related to the development of electronic displays: people think the more digits they report, the better the result.
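A minimal sketch of that comparison, assuming a standard uncertainty of ±0.005 on each reading (the uncertainty is an assumption for illustration; the readings are from the example above):

```python
import math

a, b = 0.124, 0.120
u_a = u_b = 0.005                      # assumed standard uncertainty of each reading

diff = abs(a - b)
u_diff = math.hypot(u_a, u_b)          # combined uncertainty of the difference

if diff > 2 * u_diff:                  # roughly 95 % coverage factor
    print(f"{a} and {b} differ significantly")
else:
    print(f"{a} and {b} are indistinguishable at this uncertainty")  # this branch runs here
```

With any realistic uncertainty on the third decimal, the two readings are the same number, which is the point of calling them both 0.1.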