
Long-term accuracy

The real Leigh

I just decided to check my tenth-reading 1" micrometer to see how close it was. I have an in-cal Mitutoyo gage block set designed specifically for checking mics, so I'm pretty confident the results are correct.

It reads .00015" high on a 1" gage block. (Yes, .00015", not .0015".)

Not bad for a mic I bought used 45 years ago. ;)

- Leigh
 
Hi Stephen,

I didn't check all the variables. I could if I wanted to.

The cal set includes an optical flat for checking the faces (they're carbide), but it works best in incandescent light, which I don't have here in the office.

Temperature stability isn't an issue since everything is in an air-conditioned office, and has been for a long time.

I'll investigate further when I get motivated. :smoking:

- Leigh
 
Maybe a little bit of knowing the master's size in advance? (Operator's bias)

I know you understand gage calib., but this is a bit too good.

I've run at least 50 gage studies on manual mics and I've never seen anyone repeat within two tenths on a blind gage run.

Every time someone tells me they can read a mic to a tenth, I have to sit them down and prove them wrong (many, many bets; I've never lost).
Lap a gage block two tenths under and see what size people say it is. :D Oversize masters are even better, as people will "make" them read the stamped size.
Years ago I thought I could do this; now I know better.
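Bob's point about blind repeatability can be illustrated with a toy calculation. The readings below are invented purely for illustration (an assumption, not data from any of his gage studies):

```python
# Toy blind-repeatability check: one operator, one gage block, five blind readings.
# These readings are invented for illustration, in inches.
readings = [1.00010, 1.00020, 1.00005, 1.00015, 1.00010]

spread = max(readings) - min(readings)   # total range of the run
mean = sum(readings) / len(readings)     # arithmetic average

print(f"mean  {mean:.5f}")
print(f"range {spread:.5f}")
```

Even this friendly made-up run spreads a tenth and a half, which is the sort of result a blind gage study tends to reveal.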

On the other hand, unless a mic has been abused or its faces are worn, why would it go bad? I've got 30-year-old mics (Starrett and Mitutoyo) that still work as well as they ever did.
Bob
 
The man with two sets of mics never knows just what size anything is. :codger:

I never thought much about that gage bias thing, but if measuring a 1.00000" gage block I'd certainly be reluctant to report anything different!
 
Maybe a little bit of knowing the master's size in advance? (Operator's bias)
I know you understand gage calib., but this is a bit too good.
I've run at least 50 gage studies on manual mics and I've never seen anyone repeat within two tenths on a blind gage run.
Hi Bob,

When I check cal I don't know what the master size is. I just apply the mic and note the position of the zero point on the thimble relative to the scale calibrations.

As I mentioned, this is a tenth-reading mic, with vernier lines the length of the shaft. And I do know how to read a vernier.

If I can't repeat to a tenth when reading the same gage block ten times, I'll find another job. Stop by sometime and I'll show you how it's done. ;)

I just ran the other blocks in the set, and could find no more than .0001" deviation on any of them.

The condition of the mic is excellent. I take very good care of my tools. :smoking:

- Leigh
 
Leigh

How many readings?
Just one, or enough for a sensible statistical average?
By hand feel or by ratchet?

Actually, a mean error of 0.00015" is about what you should expect from an inspector-type user with a tenths vernier micrometer over a reasonable statistical sample, say 50+ readings. The normal tendency is to read low, though, primarily due to too much tension in the ratchet and attempting to read too accurately.

H.G. Conway gives an illustrative example in Engineering Tolerances (Pitman, 1966, 3rd Ed.) where users measured a pin of 0.37675" diameter with both vernier and plain micrometers. For the vernier, the maximum errors were +0.00025" and -0.00045", with an arithmetic average of -0.00016". For the plain micrometer the final digit had to be estimated, and the maximum errors were ±0.00025" with an arithmetic average of -0.000085". Unsurprising when you think about it, as no half-decent user is likely to estimate out by more than a quarter of a division on the thimble.

Leaving aside whether the arithmetic mean is a sensible statistical accuracy parameter (it isn't really, but better ones take a lot of explaining), Conway concludes that the expected accuracy of a micrometer reading is ±0.0002" for a skilled inspector, so the micrometer is useless for measuring parts whose tolerance is less than 0.002". A vernier micrometer makes a real good comparator, better than a tenth if you really roll your sleeves up and concentrate, but it's no better than a plain micrometer for measuring.
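Conway's bookkeeping is easy to reproduce with a short script. The error sample below (reading minus true size, in inches) is invented to show the arithmetic, not taken from his data:

```python
# Arithmetic-mean and extreme-error bookkeeping, Conway-style.
# Error values (reading minus true size, inches) are invented for illustration.
errors = [+0.00005, -0.00020, -0.00030, +0.00010, -0.00010,
           0.00000, -0.00040, +0.00020, -0.00020, -0.00010]

mean_error = sum(errors) / len(errors)
worst_plus, worst_minus = max(errors), min(errors)

print(f"arithmetic mean {mean_error:+.6f}")
print(f"extremes {worst_plus:+.5f} / {worst_minus:+.5f}")
```

Note how the mean comes out low, matching the tendency Conway describes for vernier users.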

Reliable measurement gets real difficult once you get below a couple of thou. Reliable comparison and repeatability start getting hard around a tenth or so. Easy to forget the difference.

Much of my working life was spent building, developing and using uber-specialist test gear for optical R&D, to nail down what some mad scientist's dream spread over a few square yards of optical table could really do in practice. I got real tired of explaining the difference between comparison and measurement, and the importance of repeatability. Especially the time I spent six months in a dark lab manually taking about 60,000 readings (pre-computer days) to nail down the best methods of telescope magnification measurement for multi-spectral image processing devices. At least they'd invented the HP-67 calculator, so processing the data was less of a chore than with a turn-the-handle type. The boss might have carped about what the HP-67 cost, but it probably saved his life! Then some smart so-and-so invented digital image processing, which made it all a major waste of time.

Still, one nice thing about measurement in the optical world is that it's easy to set up an interferometer to generate an accurate yardstick.

Clive

P.S. Conway's book is a good read if your understanding of the real basis of tolerances has gotten a bit shaky.
 
The question arises: does it read +.00015" when it is supposed to read 0.0000"? Perhaps if you close it until there is a drag on white paper (to clean the anvil and spindle), check that it reads 0.0000", then your 1.0000" block might read 1.0000".
 
Hi Guys,
Here I go with my first post. I worked for years in a huge cold-forming and machining facility. Cold forming is basically a play-dough machine for steel or other metals. It requires some very tight tolerances in the dies to get any kind of tool life. I spent the last several years of my tool-and-die career grinding these dies, as well as most anything else that came my way. Most of the tolerances on these dies and punches were ±.0002, and they also needed mirror finishes, mostly on carbide and HSS tooling. If you work to these tolerances day in and day out, it becomes a lot easier than if you normally work in the ±.001 or ±.005 range. When you check everything with an electrolimit gage and compare it to your mics, you get a feel after a while.

Most people would probably be doing well to read a mic within .0001. But when you are measuring the same or very similar parts day in and day out, you should have no problem measuring within a tenth with a good-quality carbide-tipped tenth mic with a friction thimble that has been properly set up.
 
Ok, now say you do your test at 30degC (underscores separate thousands below). Assume for a minute that your mic is made of the same steel as the gage blocks.

Your 1.000_000" gage block now actually measures about 1.000_120" (± about 0.000_010 depending on alloy). But your micrometer, being made of steel, may also have stretched to 1.000_120 where it reads 1.000_0, so your block measures 1.000_0. Which is sometimes a blessing and sometimes a curse. If you wanted to know the steel's nominal dimension at 20degC, you got the right answer. You ship the part to a customer who measures at 10degC, and the same thing happens in reverse. If you needed to know what size it is right now, you got the wrong answer.

If you measure a 1.000_000" piece of Zerodur, your micrometer reads 0.999_9, which is the correct dimension neither now nor at 20degC. If you measure a piece of aluminum, you don't get the correct answer now or at 20degC. Stainless expands at a higher rate than other steels, so a stainless mic measuring regular steel is off as well. Body temperature on a piece of steel could throw it off by 2 tenths if you hold onto it long enough, and twice that for aluminum, which is why mics frequently have two little pieces of plastic insulation where you are supposed to hold them. And these numbers get bigger on a long piece.
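The expansion numbers above are easy to sanity-check. A minimal sketch, where the expansion coefficients are typical handbook values I've assumed, not figures from the post:

```python
# Linear thermal expansion: delta_L = alpha * L * delta_T
# Coefficients (per degC) are typical handbook values -- an assumption here.
ALPHA = {
    "gage block steel": 12.0e-6,
    "stainless":        17.0e-6,
    "aluminum":         23.0e-6,
    "zerodur":           0.05e-6,
}

def growth(length_in, material, temp_c, ref_c=20.0):
    """Change in length (inches) between ref_c and temp_c."""
    return ALPHA[material] * length_in * (temp_c - ref_c)

# A 1.000_000" steel block taken from 20degC to 30degC:
delta = growth(1.0, "gage block steel", 30.0)
print(f"{delta:.6f}")   # 0.000120 -- the ~120 microinch growth quoted above
```

Swap in "zerodur" or "aluminum" to see why the all-steel case is the only one where mic and part conveniently cancel.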

So it is easy to fool yourself that your measurements are more accurate than they really are, for this reason in addition to those stated in other posts. Like confirmation bias: if you take three measurements and get 0.999_8, 0.999_9, and 1.000_0, well, it must be 1.000_0, and I wasn't holding it right the first two times. Even when you aren't trying to cheat, it is still easy to cheat, particularly when what you are measuring looks spiffy. I like the bogus gage block trick.

All metrology 101.

How does hardness (heat treating) of an alloy affect the coefficient of thermal expansion? I got the impression from a badly written material datasheet that it could have a pretty substantial effect. And since it is often hard to measure under controlled temperature conditions, this matters: one may need to calculate the errors or apply a correction, particularly since many of the objects we measure, and the instruments we measure with, are often hardened.

Now, when I was doing work at the 0.5 nm (~0.02 microinch) resolution level, we needed to let a thick piece of glass stabilize for at least 24 hours or we could see the effect of it not being in thermal equilibrium (actually, we could still see it; it just wasn't as ginormous). I had to build a temperature stabilizer for that one (which is rough when putting a glass cover over it messes things up optically), plus we calibrated a lot. It operated in an environment (a telescope dome) where temperature varied a lot. Plus we had cryogens. Another fun one was measuring wind speeds on another planet, when both planets are rotating and hurtling through space. Even there, people tended to overlook the impact of temperature.
On one occasion, we had a piece of equipment failing intermittently. I was the only one who noticed that the failures were correlated with temperature. It took some convincing, and even more convincing to get permission to cook the million-dollar instrument (not ours) with a heat gun to keep it working. It did the job. It turned out a wire had broken (one half of a differential pair), and the heat was shifting the logic thresholds just enough for it to work. Experiences like this, though, mean I like to design temperature sensors into things and record temperature.