- Nov 22, 2009
- Boston, MA area
Generally you need good accuracy and precision if you want to do good work.
Yeahbut, what about the official definition of a meter, when it was two scratches on a long polished metal bar? Its accuracy was 100%, by definition.
So what's the precision? Basically, the width of the scratches. But but but - what about measuring things less than a meter long, and the like? Maybe 100 subdivisions - we'll call it a centimeter. So, the precision is one centimeter?
Uhh, no. That's the resolution then. Huh? Everybody calls it precision.
Lost down rabbit hole.
So what to do? All these terms have been used, generally loosely, and their definitions have blurred over time, even when Marketing wasn't involved.
Anyway, the definitions I prefer: precision is basically the number of significant decimal digits, and accuracy is how close the measurement comes to what it claims to be. Accuracy is always relative to some standard, which can be local rather than official. Or it could be the international standard meter (which is no longer defined by a scratched metal bar).
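To make the distinction concrete, here's a toy sketch with made-up numbers (the reference value, reading, and instrument step are all invented for illustration): resolution is the smallest step the instrument can report, precision is the count of significant digits in the reading, and accuracy is the deviation from the standard.

```python
# Toy illustration of the three terms, with made-up numbers.
reference = 1.000000   # the "true" length in meters, per some standard

resolution = 0.01      # smallest step the instrument reports: 1 cm
reading = 0.99         # what the instrument happens to show

# Precision here: number of significant decimal digits in the reading.
precision_digits = len(str(reading).replace(".", "").lstrip("0"))

# Accuracy: how close the reading is to the standard it claims to match.
accuracy_error = abs(reading - reference)

print(f"resolution: {resolution} m")
print(f"precision: {precision_digits} significant digits")
print(f"accuracy error: {accuracy_error:.2f} m")
```

Note the three numbers move independently: you could add digits to the readout (more apparent precision) without the instrument getting any closer to the standard (no better accuracy).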