
Is 0.00005" tolerance for std. micrometer reasonable?

greif1
I see that the Mitutoyo 293 series (standard 0-1" digital micrometer with 0.00005" resolution) has a factory accuracy spec of ±0.00005".

This seems rather extreme, having the tolerance be the same as the resolution. What are everyone's thoughts?
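One way to frame the question is the usual test-accuracy-ratio rule of thumb. A minimal sketch (the 4:1 and 10:1 figures are common metrology rules of thumb, not something from this thread):

```python
# Test accuracy ratio (TAR): part tolerance over gage accuracy.
# Rules of thumb: 10:1 is ideal, 4:1 is often the accepted minimum.
def tar(part_tolerance, gage_accuracy):
    return part_tolerance / gage_accuracy

# A +/-.0002" feature checked with a +/-.00005" mic:
print(f"{tar(0.0002, 0.00005):.1f}:1")  # 4.0:1 -- workable, but no margin
```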
 
I had an indicating micrometer, Bayer I think, .00005" resolution. I used a master gage pin (±.00001") to recalibrate every time I used it. Temperature changed it enough to make a difference. It was the only tool I had at the time to check a ±.0002" outside diameter.
 
What model exactly?
This one for example says 2um accuracy:
https://shop.mitutoyo.eu/web/mitutoyo/en/mitutoyo/1305876921890/Digital%20Micrometer/$catalogue/mitutoyoData/PR/293-821-30/index.xhtml

Sometimes seen in specifications: "accuracy does not include the contribution from resolution," or something similar, yadda yadda.
 
We have those, the ones that read to half millionths. I doubt they are that accurate (for actual size) but they do repeat pretty well. I mentioned in another thread maybe .0001-.0002" repeatability....

edit: I have a mechanical Mits drop indicator graduated in 1um!! :eek: I have never used it to measure anything, inherited from my father-in-law.
 
I see that the Mitutoyo 293 series (standard 0-1" digital micrometer with 0.00005" resolution) has a factory accuracy spec of ±0.00005".

This seems rather extreme, having the tolerance be the same as the resolution. What are everyone's thoughts?

Not so unusual. Much the same with my Hamilton ten-millionths mechanical, old enough that it was still being made BY Hamilton before Dorsey bought the line, the machines, and even hired the people.

What it sez to me, either way, is to take multiple readings and write them down.

Do multiple settings checks; write those down, too.

Then apply eyeball or even "the maths" to ascertain if you have stable numbers in a tight spread that repeats. Or need to try again, carefuller.

Temperature was mentioned. Then how clean was clean? How stable the positioner(s)? How stable the very Earth, vibration-wise. How careful the operators?

"Etc".

Touchy stuff for any form of "contact" measurement heavier than a beam of light.
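The temperature point above is easy to put numbers on. A minimal sketch, assuming a generic handbook CTE for steel (~6.4e-6 in/in/°F; the exact alloy matters):

```python
# How much a steel part grows per degree F. The CTE value is a generic
# handbook figure for steel, assumed here for illustration.
CTE_STEEL = 6.4e-6  # in/in per deg F, approximate

def thermal_growth(length_in, delta_deg_f, cte=CTE_STEEL):
    """Length change (inches) for a given temperature change (deg F)."""
    return length_in * cte * delta_deg_f

# A 1" part warmed 5 deg F by handling:
print(f'{thermal_growth(1.0, 5.0):.6f}')  # 0.000032 -- most of a half-tenth already
```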
 
We use the 293s at work and they're very good. I bought one for home too. Still, as said above, technique is everything. Under ideal conditions they repeat to the last digit. Less than ideal conditions, not so much.
 

My first-ever mic that wasn't "company" property was a gamble.

Foreign-made? Satin chromed? Looked nicer than a proper Brown & Sharpe (Starrett was for hobbyists, poor folks, or "company issued" metrology, back in the day..)

But for NINE DOLLARS? Was it any GOOD?

Who TF ever heard of "Mitutoyo" anyway?

That damned nine-dollar Mitutoyo DOES have a sort of goldenish frost to the edges of the frame from wear.

60 years later.

You might have to "sell" some folk on Mitutoyo. I surely ain't among 'em!
 
I see that the Mitutoyo 293 series (standard 0-1" digital micrometer with 0.00005" resolution) has a factory accuracy spec of ±0.00005".

This seems rather extreme, having the tolerance be the same as the resolution. What are everyone's thoughts?

.
Normal: if you want to measure .0001", you need resolution of half that, or .00005".
.
Obviously, with an indicating gage it's easier to see out-of-round or out-of-flat, waviness of the part, and taper as you move the tool. Many a flat surface, when indicated with a .0001" indicator, easily shows waviness over .0001".
 
We have those, the ones that read to half millionths. I doubt they are that accurate (for actual size) but they do repeat pretty well. I mentioned in another thread maybe .0001-.0002" repeatability....

edit: I have a mechanical Mits drop indicator graduated in 1um!! :eek: I have never used it to measure anything, inherited from my father-in-law.

Errm, that's 50 millionths, not half millionths. :) Maybe you were thinking half-tenths... And 1 micron is just a tiny bit finer than 50 millionths - 0.00004" vs 0.00005" (39 millionths actually, vs 50 millionths) - so almost equivalent.

You might be able to get close to holding +/- 50 millionths with a mic that reads in that resolution but it's going to take some care and frequent checking of an accurate reference - preferably immediately prior to each measurement. This way you aren't relying on the "accuracy" of the measuring device, but just using it as a comparator.

This is the nice thing about analog measurement vs. digital when trying to measure at the resolution limits of the measuring instrument - you can interpolate between the lines with analog. Digital is either one increment or the next.
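The comparator approach described above can be sketched like this (the function name and values are invented for illustration):

```python
# Use the mic as a comparator: read a certified reference immediately
# before the part, and carry the observed difference. Any stable bias in
# the mic cancels out of the result.
def compare_to_reference(ref_certified, mic_on_ref, mic_on_part):
    return ref_certified + (mic_on_part - mic_on_ref)

# Reference pin certified at 0.25001"; mic shows 0.25004" on it (so the
# mic is reading 0.00003" high) and 0.25019" on the part:
size = compare_to_reference(0.25001, 0.25004, 0.25019)
print(f'{size:.5f}')  # 0.25016 -- the mic's bias drops out
```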
 
If they had the same mic as a bench model, in a temp-controlled room, with all the usual precautions, I wouldn't be surprised if it accurately repeated to half a tenth. But hand-held on hand-held parts, shop floor, I'd put a piece of tape over the last digit - just a distraction.

I have a Mit digital that reads in microns. Hate it. It's unwieldy. I get a lot of prints with tolerances of ±2 microns on some features; way more comfortable with my analog Mit mics doing that. Most parts are under 3mm diameter and the mics are calibrated on plug gages in that range. It's fun to show the guys just out of school how accurate their eyes are on an analog mic.

If I know the customer is really going to break balls, I double-check with a Cary indicating bench mic.
 
..... What are everyone's thoughts?

People really, really want to trust the gauge they bought.
Few approach it from the side of trusting nothing and proving worst-case error in real use.
.0002" R&R is maybe out of range for this mic in normal tested use on the floor, and I have more than a handful.
That said, I do very much like this micrometer.
:popcorn:
Bob
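Bob's "prove worst case error in real use" can start as simply as repeated readings of one feature and a look at the spread (the readings below are invented):

```python
# Repeatability check: same feature, same operator, several readings.
import statistics

readings = [0.50012, 0.50010, 0.50013, 0.50009, 0.50011]  # invented data

spread = max(readings) - min(readings)   # range across trials
sigma = statistics.stdev(readings)       # sample standard deviation
print(f'range {spread:.5f}"  stdev {sigma:.6f}"')
```

A full gage R&R adds multiple operators and parts on top of this, but the range of repeated readings is the honest first look.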
 
People really, really want to trust the gauge they bought.
Few approach it from the side of trusting nothing and proving worst-case error in real use.
.0002" R&R is maybe out of range for this mic in normal tested use on the floor, and I have more than a handful.
That said, I do very much like this micrometer.
:popcorn:
Bob
Like Bob said, I'm comfortable with .0002" as well. We have the coolant-proof ones. I verify mine with Deltronic pins a few times a day, always the pin closest to the part size I'm checking.
I have 4 of them and they repeat better than any mic I have used, with the exception of the Pratt & Whitney Supermic ;)
My customer's inspectors sold me on these mics. I know if I get xxxx reading, he gets the same thing 90% of the time; 5% of the time it's temp difference, and the other 5% of the time it's that I fat-fingered it. ;)
 
This is the nice thing about analog measurement vs. digital when trying to measure at the resolution limits of the measuring instrument - you can interpolate between the lines with analog. Digital is either one increment or the next.

"Back in the Day".. when digital was first being brought into existence.. we were more steadfast about proper descriptions. A display was "3 and a HALF" digits.

Because.. there is always one more stage in the electronic guts that makes the "decision" on the "half" point, how often "sampled", how much inertia as to reporting the change measured to the display.. at which point(s) it commanded a digit to flip by one FULL "count".

Useful at improving stability. Reduces eye-confusing flickering. Has been KNOWN since Big Bang of "digital" to potentially obfuscate, even "outsmart" the human as the price of convenience.

The experienced human eye could AT LEAST have seen the difference of 1/4 division, 1/2 division, 3/4 division, if not slightly better, yet.
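The counting point above, as a toy sketch of what a digital display does (resolution chosen to match the 50-millionths mics in this thread; names invented):

```python
# A digital readout snaps to the nearest count; anything between counts
# is invisible. An analog reader can interpolate that remainder by eye.
RESOLUTION = 0.00005  # display resolution, inches

def displayed(true_size, res=RESOLUTION):
    """What the digital display shows: the nearest whole count."""
    return round(true_size / res) * res

print(f'{displayed(0.250074):.5f}')  # 0.25005 -- the last 24 millionths vanish
```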

Not just "majority rule" after smoothing out the bumps so as to not annoy the easily annoyed end-Lusers.

Convenience is a nice thing to have. Just don't ever lose sight of the price it charges.

2CW

... and BOTH types of metrology. Of course.

:D
 
I have the Mitutoyo high-accuracy Digimatic mic (MDH-25MB), resolution to 0.000005 inch (yep, 5 millionths of an inch), but one must manage temperature around this mic (gloves and no breathing ;) ) to get these values.....
 
Like Bob said, I'm comfortable with .0002" as well. We have the coolant-proof ones. I verify mine with Deltronic pins a few times a day, always the pin closest to the part size I'm checking.
I have 4 of them and they repeat better than any mic I have used, with the exception of the Pratt & Whitney Supermic ;)
My customer's inspectors sold me on these mics. I know if I get xxxx reading, he gets the same thing 90% of the time; 5% of the time it's temp difference, and the other 5% of the time it's that I fat-fingered it. ;)

Honest question... isn't a Deltronic pin a poor way to check a mic reading? My thinking: when getting into .0001-.0002" measurement, if the mic anvils aren't "perfect," you could theoretically get a different reading at the front/back or left/right. Wouldn't a gage block and the deviation on its cert (i.e., your .100" block is actually .00003" undersize) be a better way, especially if you're checking mostly flat parts?
 
So, 1/2 a tenth, or 1.27 µm. I would question the heat factor, etc., as has been brought up. Yes, a P&W Supermic in a temp-regulated room is nice. I still like to run stuff on the CMM; it gives form fault, etc. Mine's not rated to that, but I find with good program alignment I can get good results.
 
Honest question... isn't a Deltronic pin a poor way to check a mic reading? My thinking: when getting into .0001-.0002" measurement, if the mic anvils aren't "perfect," you could theoretically get a different reading at the front/back or left/right. Wouldn't a gage block and the deviation on its cert (i.e., your .100" block is actually .00003" undersize) be a better way, especially if you're checking mostly flat parts?
I've noticed this with the small diameters I work with. I've rectified a few with a bronze disc lap and diamond paste; it did help a lot but took forever.
 
Deltronic pins come class X and are 0/+.00004", which makes them a plus pin unless otherwise specified. (I think it was an S pin?)
 
Honest question... isn't a Deltronic pin a poor way to check a mic reading? My thinking: when getting into .0001-.0002" measurement, if the mic anvils aren't "perfect," you could theoretically get a different reading at the front/back or left/right. Wouldn't a gage block and the deviation on its cert (i.e., your .100" block is actually .00003" undersize) be a better way, especially if you're checking mostly flat parts?

Deltronic pins for mic checking when doing lathe work (i.e., round), and gage blocks for mic checking when doing mill work. Sorry, should have clarified that, since most of the work that's tight-tolerance is in the lathe.
 