
Differences in multiple micrometer measurements

awander

I was doing some measuring today and I was surprised by the results.

I bought an older B & S 233 bench micrometer recently, and when I first got it, I adjusted it so that when I closed the clean anvil/spindle faces using the ratchet, it read "zero".

Then yesterday, I used it to measure a round piece (it was a master thread wire that I wanted to make sure was in the correct vial). The micrometer showed that the wire was about .001" smaller than it should have been. So I figured it was in the wrong vial. But just to double check, I pulled out a gage pin and measured it. The gage pin, too, measured smaller than it was marked, by nearly .001".

Next, I tried using a gage block, and the micrometer measurement agreed with the marked size of the block.

My conclusion is that with the same amount of pressure (using the ratchet for all measurements), the micrometer will measure an object as larger or smaller depending on how much area of the measurement faces the object is in contact with. Does this seem correct to those of you who know about such stuff?

On a related note, I also found that I could get very different readings depending on how fast I closed the micrometer: it seemed that if I spun it a little faster using the ratchet, rotational inertia of the spindle/thimble would cause it to close down a bit more, measuring between .0004" and .0008" smaller on the same gage block. Even when I was careful to close the micrometer on the block very slowly, I couldn't get consistent readings closer than .0002" or so.
 
Yes, don't spin 'em down! Slow and steady. There is a slight difference between rounds and flats, but not 0.001"! Are the anvils known to be flat and parallel? There are optical flats for checking that. Does the thing have any slop? Is the taper nut (if it has one) properly adjusted?
 
Hi Conrad, no discernible slop in the threads.

Good point on the parallelism. I don't have any optical flats, though....
 
Measuring anything less than a thousandth with a micrometer is problematic for exactly the reason you state: variations in pressure.

If you read a textbook on metrology, or Starrett's documentation on micrometers, they will tell you that the nominal accuracy of a micrometer is plus or minus 0.0005", for a total error band of 0.001".
 
I find that hard to believe. When is that material dated? What style of micrometer? I am not going to be one of those guys who swears he can measure a tenth, but I can get closer than .001"! If I have gage blocks that are certified (actual certs showing the variation from nominal) and my shop is climate controlled to the same temp specs as our CMM lab, I easily get within .0002-.0003" of what the gage block cert says the size is. Yes, I know people here work down to the sub-micron level; I am not talking about that. I am not even saying I can say with 100% certainty that .500" is .500", but I sure can get my mics to repeat within a couple of tenths or so!
 
Starrett and other manufacturers quote common micrometer accuracies at 0.002 mm, plus or minus, afaik.
About 10x better than was mentioned.

I.e. plus or minus 2 microns, trending towards zero, and mostly +/-1 micron.

My tests with gage pins confirm this, 3 sizes, blind test, electronic mic.
Spinning the mic onto the std will bend the frame, a bit.

Crap cheap digital import mics were poor, 2-5 microns variation.
Good value digital import mics were great, 0-1 microns variation; 1 of 33 measurements was 2 microns out.
I measured 6.000 mm, 6.100 mm and 6.200 mm gage pins, fwiw.

The best electronic DTIs have errors of 0.3 microns.
About 800€.
Mahr Millimess 1002, et al.

I expect 3 microns or less typical error with digital micrometers.
A high precision spindle measures about 25.002 mm consistently.
A DTI shows the high spot, consistently, on a V block.

Just my experiences, and reading.
 
You squash or squeeze round objects more than flat objects. Indicating micrometers are a micrometer with an indicator, to measure to .00005" with much less pressure.

Usually you can measure to .0001", but you can easily get +/-.0005" readings if not careful. Larger sizes are harder to measure. Measuring a round bore or hub over 12" is a lot harder, and it is very easy to get +/-.0005". Most indicating gages have 3 supports to help center the gage and help get repeatability.

Try using a caliper extender with a 6 foot long bar. It does not matter if the caliper reads to .0005"; holding a 6 foot long bar square is not easy. You are lucky to get +/-0.002" at 6 foot length.
 
Summarizing much of the above, and adding a few points:
- Pressure makes a significant difference in readings (hence ratchets, friction thimbles, actual pressure gages)
- Seeing .001" difference and assuming moderate care or "feel" suggests something else is wrong
- Check the feel as it closes to zero. Is it immediately firm, or a bit squishy as it reaches zero?
- If the anvils have a speck of dirt, that could cause a pin to read larger. Carefully clean the anvils.
- Doesn't take much for the pin to have a tiny nick, a bit of grime, a slight bend either.
- Slight looseness in the threads could show up as a sort of squishy feel. An old mic is likely to have the spindle threads worn near whatever it most frequently measured (thickness of sheet? turned parts?).
- Anvils worn or out-of-parallel can cause errors. One easy check is a set of micrometer gage blocks sized to check the anvils at different degrees of rotation. The more detailed check is with an optical flat.
- The B&S 233 micrometer you have has a relatively small anvil area, and early versions didn't have carbide faces -- somewhat more likely to wear. This wear is often around the edges, leaving the center a bit proud. Your pin might read a bit different if located near the center or the edges of the anvils.
- The larger thimble on the 233 is nice for reading, but as you've already noted it is heavy enough to have a sort of impact problem if you try to spin it fast. That you can spin it fast might also suggest some slight looseness in the threads, perhaps especially near zero?

Could well be that after cleaning the anvils, lubricating the threads, adjusting the screw fit, and using proper "feel" you can stay within .0002" or so. Which is pretty good for an old micrometer. And, yours has the advantage of easy reading, decent ergonomics, and fewer thermal effects (since you're not holding the frame).
 
Well, if you use the thimble and the work is perfectly flat, then +/-0.0002" is achievable, but in practice what I have seen is that once you go below 0.001" there is a lot of variation. If you ask 5 different people to measure the diameter of a randomly selected gage pin, in most shops you will see a variation of at least 0.0005", and in many cases significantly more. That is why when somebody gives me a mic reading I never trust it to more than a thousandth. The reasons for variation:

- wear of the mic's mechanism
- wear to the anvils
- overtightening or undertightening
- unevenness in the surface of the work piece
- variation in the seating of the work piece between the anvils

You can do this experiment yourself. Just give the same gage pin to 5 different machinists where you work and ask them what its diameter is, using only a micrometer for measurement. (The gage pin should be of some odd size, not a round number, and you should let each subject use his own micrometer; they should not share the same micrometer.)
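For anyone who wants to tally such an experiment, a minimal sketch is below. The names and readings are hypothetical, invented purely for illustration:

```python
# Tally a pass-the-pin-around experiment: several machinists, one odd-size
# gage pin, each using his own micrometer. All readings here are hypothetical.
from statistics import mean, pstdev

readings_in = {
    "machinist_1": 0.4373,
    "machinist_2": 0.4369,
    "machinist_3": 0.4375,
    "machinist_4": 0.4371,
    "machinist_5": 0.4368,
}

values = list(readings_in.values())
print(f"mean   = {mean(values):.4f} in")
print(f"spread = {max(values) - min(values):.4f} in")  # 0.0007" -- over the 0.0005" floor
print(f"stdev  = {pstdev(values):.5f} in")
```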
 
When you start talking about sub-thou accuracy and repeatability, you get into ancillary factors like heat and air currents. The heat of your hand holding the workpiece or the micrometer can add a few tenths error.

This is one reason high-resolution bench mics are mounted on stands.


- Leigh
 
Check the mic with optical flats, verify the mic's accuracy with certified gage blocks (handle the gage blocks as little as possible, and wear rubber/latex gloves), and I still don't believe you will achieve the accuracy you are seeking.
 
I politely disagree with some statements above.
My experiences:
Using electronic micrometers of 1 micron resolution, and gage pins.
(I have 30+ of the same brand mics, 2-6 of every size to 6". I import these myself. Quality stuff, at == 70€ each).

I used 3 sizes blind.
6.010 mm
6.020
6.030

11 readings each.
ONE reading was off by 2 microns, reading large: 1 of 33.
Half the readings were correct.
Half were off by 1 micron.

Temps were "room temp", for both gage pins and mics, of course.
Both == same temp, so any errors cancel out.

This shows that, based on empirical testing, I have over 95% confidence that the errors are +2 microns or less, trending to 0-1.

Take a gage pin.
Keep measuring it with electronic mic.
Hold it tight in your hand, in your fist, 20 secs.
Measure again.
It will still be the same size, to within max 1 micron error, and even this will be hard to do (it takes several minutes of holding).
Mostly, it measures the same size every time.
No way will handling a small part with fingers affect the size significantly.
And it will always grow, never shrink.

Lot of FUD re: temp controlled rooms, gloves etc.
If an error budget of 0-2 microns is acceptable, nothing to it.

The difference is that an error budget of 0.2-0.8 microns - so that you have LESS than 1 micron error at 95% confidence - is hard to do.
0-1 (worst case 2) microns true size - easy.
0-1 (2) microns TIR - easy.

LESS THAN 1 micron error - hard.

What you actually need will then depend on your application.
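To put rough numbers on that 95% claim, a minimal tally of the 33-reading test described above; the exact per-reading split is my assumption from "half correct, half off by 1 micron, one off by 2":

```python
# Tally the 33-reading blind test. The split is reconstructed from the post's
# summary: about half dead on, about half off by 1 micron, one off by 2.
from collections import Counter

deviations_um = [0] * 16 + [1] * 16 + [2]   # 33 readings, microns off nominal

print(Counter(deviations_um))                # Counter({0: 16, 1: 16, 2: 1})
within_1 = sum(d <= 1 for d in deviations_um) / len(deviations_um)
within_2 = sum(d <= 2 for d in deviations_um) / len(deviations_um)
print(f"within 1 micron:  {within_1:.0%}")   # 97%
print(f"within 2 microns: {within_2:.0%}")   # 100%
```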
 
Further to my post nr 12: re accuracies to 2 microns or so.

It's a specific case - and *only* appropriate to that, imo.
Small workpieces, steel or cast iron, etc.

When the workpieces get large, say from single inches 1-5, to 40"/1 m or so, the situation changes totally.
Now, a single degree C has an effect, a large one.

At 25 mm (== 1") diameter, a 3 degree C delta-T = 1 micron growth.
At 1000 mm, the same 3 C delta-T = 40 microns, or == 0.002".

Significant error.
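A quick check of that arithmetic, as a minimal sketch: the expansion coefficient below (~11.5e-6 per degree C, typical for carbon steel) is my assumption, and the post's figures are rounded.

```python
# Linear thermal growth: dL = alpha * L * dT.
# alpha assumed ~11.5e-6 /degC (typical carbon steel); exact value varies by alloy.
ALPHA_STEEL = 11.5e-6  # 1/degC

def growth_um(length_mm: float, delta_t_c: float) -> float:
    """Thermal growth in microns for a given length (mm) and delta-T (degC)."""
    return ALPHA_STEEL * length_mm * delta_t_c * 1000.0  # mm -> microns

print(f"{growth_um(25.0, 3.0):.1f} um")    # ~0.9 um: the ~1 micron figure at 25 mm
print(f"{growth_um(1000.0, 3.0):.1f} um")  # ~34.5 um (~0.0014"): the ~40 um at 1 m
```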

Thus, when making large assemblies or objects, it is obvious that you must get them to less than 1 degree C temperature difference, for errors in the single microns.

And getting measurement errors down to the 0-2 micron range is significantly more difficult.
== Pretty Hard To Do.

So, that's why it's also very hard to get true positions for holes/features/datums etc. to low-micron accuracies.

Small stuff == fairly easy.
Large stuff == Very Hard.

And inside hole measurement is harder, and not just TIR but roundness (lobing) becomes an issue, etc.
 
Thermal expansion coefficients give me a headache daily; my shop makes bushings and other assorted industrial doo-dads out of cast bronze, up to about 40" dia/major dimension.
Commonly I see parts called out to 15" ID +/-0.0005"... it's crazy! It leaves a 5.5°F window for both inspection while manufacturing (assuming the part was dead nuts in the middle of the tolerance band) and upon receipt by the customer. Plus they are castings, and they don't always play by the rules regarding thermal expansion.
Holding a normal inspection room at +/-3°F is pretty tough, let alone an entire machine shop, and it's not like you want to take the part out of the machine to make the measurement before the final cut.
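For what it's worth, the temperature-window arithmetic sketches out as below. The expansion coefficient is my assumption (~12e-6 per degree F, which back-calculates to the 5.5°F figure); published values for cast bronze vary by alloy.

```python
# Temperature window over which a part stays inside its total tolerance band,
# assuming it was machined dead in the middle of the band.
ALPHA_BRONZE = 12e-6  # 1/degF, assumed -- check the actual alloy's datasheet

def temp_window_f(size_in: float, total_band_in: float) -> float:
    """Full window (degF) before thermal growth alone uses up the band."""
    return total_band_in / (ALPHA_BRONZE * size_in)

# 15" ID at +/-0.0005" -> 0.001" total band
print(f"{temp_window_f(15.0, 0.001):.1f} F")  # ~5.6 F, close to the 5.5 F above
```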
 
Yes.

That's why BOTH the "hard/expensive/costly" camp and the "nothing-to-it" camp are right.



 
Plus, running down to zero can be the most-used area of the screw. I prefer to have a Jo block or gage round near the part to be checked. Dial micrometers and bench micrometers often check a bit off in absolute terms, but can be dead true used as a comparator against a gauge.
 