
Help me understand accuracy vs resolution.

Atomkinder

Titanium
Joined: May 8, 2012
Location: Mid-Iowa, USA
Perusing fancy electronic height gages, I came upon one for sale (mostly wishful thinking), did some googling on it, and came up with the product page. On said page the accuracy of the height gage in question is .00079" (20 µm), but the resolution is .00005" with a repeatability of .00008". Now I can wrap my head around a resolution slightly finer than the repeatability, but I can't understand how the accuracy can be nearly sixteen times the resolution!

Anyone have ideas here? How can this tool be trusted to anywhere near its claim if this is the case, or am I missing something?
 
Happens all the time (the accuracy spec being coarser than the resolution).

Typical height gage has an accuracy around .001" (about the same as your .00079"). Doesn't matter a whole lot if it's a vernier, dial, or lower cost (think caliper scale) digital. You can do a lot better than that in an electronic height gage; but at a $$$ cost.
 
Resolution can be whatever you want it to be... 10 digits... 20 digits... 100 digits... whatever the hardware can handle.
This is controlled by the electromechanical system that converts real measurements into electronic bits for the computer to munch.

Accuracy is the key specification for any measuring instrument. It usually is worse than the resolution.

The idea is that if you need to make small adjustments centered on a particular value, you can always set an artificial zero at that value using gage blocks or another highly accurate reference.

After that you can measure small distances either side of that reference to whatever increment the electronics will do.
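To put numbers on that idea, here is a minimal Python sketch (hypothetical offset and heights, not this particular gage) of why a fixed error drops out when you measure differentially from an artificial zero:

```python
# Hypothetical gage model: reading = true height + fixed offset error, rounded
# to the display resolution. The offset stands in for the accuracy limit.

RESOLUTION = 0.00005    # in, smallest display increment (from the OP's spec)
OFFSET_ERROR = 0.0006   # in, assumed fixed error, within the .00079" accuracy claim

def reading(true_height):
    """Simulated display value for a given true height."""
    return round((true_height + OFFSET_ERROR) / RESOLUTION) * RESOLUTION

reference = 1.00000   # artificial zero set on a gage block stack
part = 1.00035        # feature 0.00035" above that reference

absolute_error = reading(part) - part
differential_error = (reading(part) - reading(reference)) - (part - reference)

print(f"absolute reading error : {absolute_error:+.5f} in")    # carries the full offset
print(f"differential error     : {differential_error:+.5f} in")  # the offset cancels
```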

- Leigh
 
Think about it like this:

You have a lathe with a worn-out lead screw on the cross slide. The dial has graduations every .001 inch. That's the resolution. However, you know that if you crank it from full back (where the threads are more or less pristine) to full forward (where the threads are razor edges at best) you'll lose .02...

Your accuracy is only .02, or worse, despite the dial reading in units of .001...

(ignoring compensation in your head, or in a controller, and also ignoring interpolation)
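To make that concrete, here is a toy Python model (made-up travel and wear figures, assuming for illustration that the wear error grows linearly with travel) of a dial that resolves .001" while the screw error limits accuracy:

```python
# Toy model of the worn-lead-screw example; the numbers below are invented.

DIAL_RESOLUTION = 0.001   # in per dial graduation (the "resolution")
TRAVEL = 4.0              # in of cross-slide travel assumed for this sketch
MAX_WEAR_ERROR = 0.02     # in of accumulated error at full forward travel

def actual_movement(dial_inches):
    """Assume, for illustration only, that wear error grows linearly with travel."""
    error = MAX_WEAR_ERROR * (dial_inches / TRAVEL)
    return dial_inches - error

for dialed in (0.5, 2.0, 4.0):
    print(f"dial says {dialed:.3f} in -> slide actually moved {actual_movement(dialed):.3f} in")
# The dial still steps in neat .001" increments the whole way; how far those
# numbers can be trusted is a separate, and here much larger, question.
```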
 
You guys are all answering the opposite of what was asked.

He said the accuracy was 16 times better than the resolution. Not the reverse.
 
Or think about it like this:

You don't know much about machines, but you're on an unmentionable hobby forum. You buy a stepper motor that gives you 1024 positions per rotation. You buy some threaded rod at the hardware store that is 24 threads per inch. You attach all this to some MDF, some rollerblade bearings, and a Dremel.

Your resolution is .0000407 inch per position of your stepper motor. Your accuracy... won't be.
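A quick Python check of that resolution arithmetic (the poster's hypothetical hobby build, not a real machine spec):

```python
steps_per_rev = 1024       # stepper positions per rotation
threads_per_inch = 24      # hardware-store threaded rod

inch_per_rev = 1 / threads_per_inch            # carriage travel per full turn
inch_per_step = inch_per_rev / steps_per_rev   # theoretical resolution

print(f"{inch_per_step:.7f} in per step")      # -> 0.0000407 in, as stated
# None of which says anything about accuracy: backlash, a bent rod, and MDF
# flex can swamp that figure by orders of magnitude.
```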
 
You guys are all answering the opposite of what was asked.

He said the accuracy was 16 times better than the resolution. Not the reverse.

It's just his wording that has you confused (that the accuracy was sixteen times the resolution) -- the OP also provided the numbers. Pretty sure we're all correctly responding to the numbers he provided for accuracy and resolution. The OP didn't say it was "better," and what his numbers say is roughly sixteen times "bigger" (aka worse).
 
This particular electronic height gage, while not the peak of quality, is one that is recognized. It is not a caliper-style digital, and it was originally over $4k new. I suppose what I am really curious about is how it could have been sold as such, with an accuracy so much WORSE than its resolution! I understand that my Mitutoyo electronic micrometers are accurate to .0002" with a resolution of .00005", but that is a small enough difference to be negligible in my mind and in practice, given either a master or proper calibration.

You guys are all answering the opposite of what was asked.

He said the accuracy was 16 times better than the resolution. Not the reverse.

No, it is sixteen times the AMOUNT. It is less accurate by far than its display.

Edit: I fear I was unclear in my question. It is more about whether I have missed something than whether this is possible. I understand that it is possible; I am just astounded that there is any reason to BUY the damn thing if its display is virtually meaningless! Why spend >$4000 on a height gage that can only be trusted to .0005" when it gives you another order of magnitude in the display?
 
It is more about whether I have missed something than whether this is possible.
It's not only possible, it's normal. That's the way electronic equipment is designed.

It has absolutely nothing to do with the nature of the parameter being displayed.
Were it showing the temperature of a freezer or the speed of an airliner, the same engineering methods would be followed.

Why spend >$4000 on a height gage that can only be trusted to .0005" when it gives you another order of magnitude in the display?
You're paying for the accuracy, not the resolution.

-----

Resolution comes into play when comparing two measurements that are very close but not exactly equal.

For example, measuring a .10000" gage block and a .10005" gage block, you should see that .00005" difference, even if
the values were shown as .09990" and .09995".

-----

Another point regarding electronics, is that any analog-to-digital conversion has an ambiguity of +/- 1 least-significant digit (LSD).
This is true of ALL A-D systems, regardless of the parameter being measured or the reason for the measurement.

That means if the smallest value in the system is .0001", all readings have an error of +/- .0001" on top of any accuracy spec.
Some manufacturers choose to avoid this by simply not displaying the LSD in the first place.
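A small Python sketch of that +/- 1 LSD point (an arbitrary .0001" count size, just to show the quantization effect in isolation):

```python
LSD = 0.0001  # in, smallest displayed increment in this example

def display(true_value):
    """Round to the nearest displayable count, ignoring every other error source."""
    return round(true_value / LSD) * LSD

for true in (0.123040, 0.123060, 0.123049, 0.123051):
    print(f"true {true:.6f} -> display {display(true):.4f}")
# 0.123049 and 0.123051 straddle a count boundary: a real difference of .000002"
# shows up as a full .0001" step in the display -- the quantization ambiguity on
# top of whatever the accuracy spec already allows.
```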

- Leigh
 
Example of high resolution but low accuracy: A wooden ruler with thick lines, labeled with decimal equivalents out to four or five decimal places ("This thing says it's 'zackly 1.8125 long").
 
The fact that this is a Trimos helps a little in describing this.

The accuracy they are quoting applies to a measurement between any two points in its entire measuring range. This is actually telling us the total error of the scale, worst case. That's why it is significantly greater than its repeatability or resolution.

The resolution is just a measure of the smallest detectable increment on the scale. This one is 1 micron. Resolving 1 micron also means that we can only measure to +/- 1 micron, which in effect makes 2 microns the smallest meaningful measurement.

The 2 micron repeatability is the variation that you would see if you measured something a large number of times. All of the measurements would fall within a 2 micron window. This is actually very good in this case, since the repeatability equals the smallest meaningful measurement. Can't get better than that.
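To show how those three specs would look in data, here is a rough Python sketch with made-up readings (in mm, to match the micron figures above; not data from this gage):

```python
RESOLUTION = 0.001   # mm, smallest increment the scale reports (1 micron)

# Ten repeat measurements of the same gage block: the spread is the repeatability.
repeats = [25.001, 25.002, 25.001, 25.002, 25.001, 25.002,
           25.002, 25.001, 25.002, 25.001]
repeatability = max(repeats) - min(repeats)
print(f"repeatability window: {repeatability:.3f} mm")

# Measurements of known blocks spread across the range: the worst deviation from
# the known values is the kind of full-range error the accuracy spec describes.
known    = [10.000, 100.000, 300.000, 600.000]
measured = [10.002, 100.004, 300.009, 599.992]
worst = max(abs(m - k) for m, k in zip(measured, known))
print(f"worst-case error over the range: {worst:.3f} mm")  # far bigger than the repeatability
```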

This Trimos you are looking at is actually quite good for what it is. The .0005 number is the worst-case accuracy over a full-scale measurement. What is confusing about this one is that Trimos is being very straightforward in presenting the true accuracy and precision that this instrument can and will deliver. The other examples you are comparing to are not even in the same ballpark as the Trimos. They just don't state their true numbers the way Trimos does.

The whole topic of precision and accuracy gets to be very complicated, especially when it is treated statistically. Bottom line is that the average person is not measuring anything close to what he thinks he is.
 
At the risk of repeating what has already been written, I'll offer my "contribution".

Resolution is what is shown in the display and how many digits are shown after the decimal point.

Accuracy is what the dimension can be measured to with reasonable certainty.

Repeatability is how close the measurements are to each other when the dimension is measured several times.

Use a reliable gauge block (or several) and you'll soon see the difference between the three ;)

I suppose you know that what you read on your car speedometer probably isn't the speed you are driving at? Chances are you are driving up to 10% slower than what the "display" shows. I have never yet seen a speedometer that displayed a number slower than the speed actually being driven. Who wants a "slow" car? Don't use that argument if you get stopped by the police, as you might just have a reasonably accurate speedometer :)
 
I have a bar that is exactly 25mm. I measure it with a vernier, which reads exactly 25mm, so my vernier is said to be perfectly accurate. I now measure the same bar with a micrometer, which gives me 24.998mm. This means that the micrometer is not as accurate as the vernier.

I now heat up the bar so it expands by 0.002mm, and measure again. The vernier still says 25mm because its graduations cannot resolve the 0.002mm, but the micrometer now says 25.000mm. From this I deduce that the micrometer is more precise than the vernier, even though it is still less accurate, because the bar is actually 25.002mm.

So the two are very different. My measurement device may be totally inaccurate and out by a full meter but if it can detect subtle changes it is still precise.

Likewise my tool may be extremely accurate but only able to detect coarse differences since it is not precise.
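The same scenario in a short Python sketch (hypothetical instrument models, not real specs): the "vernier" reads true but only to 0.02mm graduations, while the "micrometer" reads 0.002mm low but resolves 0.001mm:

```python
def vernier(true_mm):      # accurate but coarse: no offset, 0.02 mm graduations
    return round(true_mm / 0.02) * 0.02

def micrometer(true_mm):   # precise but inaccurate: 0.002 mm low, 0.001 mm resolution
    return round((true_mm - 0.002) / 0.001) * 0.001

for bar in (25.000, 25.002):   # before and after heating
    print(f"bar {bar:.3f}  vernier {vernier(bar):.2f}  micrometer {micrometer(bar):.3f}")
# The vernier reads 25.00 both times and misses the change; the micrometer reads
# 24.998 then 25.000, seeing the 0.002 mm change while staying wrong in absolute terms.
```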
 
I have a bar that is exactly 25mm. I measure it with a vernier, which reads exactly 25mm, so my vernier is said to be perfectly accurate. I now measure the same bar with a micrometer, which gives me 24.998mm. This means that the micrometer is not as accurate as the vernier.

I disagree because the caliper only displays to 0.01mm but I'm assuming your micrometer reads to 0.001mm.

The example you give has nothing to do with accuracy but very much to do with capability ;) You're comparing coconuts with walnuts.
 
I have a bar that is exactly 25mm. I measure it with a vernier, which reads exactly 25mm, so my vernier is said to be perfectly accurate. I now measure the same bar with a micrometer, which gives me 24.998mm. This means that the micrometer is not as accurate as the vernier.

I now heat up the bar so it expands by 0.002mm, and measure again. The vernier still says 25mm because its graduations cannot resolve the 0.002mm, but the micrometer now says 25.000mm. From this I deduce that the micrometer is more precise than the vernier, even though it is still less accurate, because the bar is actually 25.002mm.

So the two are very different. My measurement device may be totally inaccurate and out by a full meter but if it can detect subtle changes it is still precise.

Likewise my tool may be extremely accurate but only able to detect coarse differences since it is not precise.

I wish you'd figure out what your bar really is ;) I'm sure I know what you mean but it's (to me) a very confusing example. I stand by "You're comparing coconuts with walnuts."
 
What you have is a manufacturer actually being somewhat truthful about accuracy in real use.
This is +/-.0004 on a 50 millionths gage.
My 50 millionths digital Mits mics routinely test out in the same range.
There is always a "spec war" in the gage world.
This number would normally be quashed by the marketing department.
Bob
 