What's new

Zero setting standard for Mitutoyo micrometer

swellwelder

Stainless
Joined
Sep 21, 2002
Location
Valley City, ND USA
My new (to me) 7-8" Mitutoyo micrometer came with this 7" standard, but stamped into the barrel of the standard is the actual size, listed as 7.000015" (yes, that is the correct number of zeros). This is a .0001" micrometer, so why go that much further in accuracy when the tool can't reliably be that accurate?

Dale
 
15 millionths is more than 1/10 of 100 millionths (1 ten-thousandth is 100 millionths), so it's starting to get into the range where it might matter on a tight tolerance.

They don't expect you to read the mic to 10s of millionths. They expect you to figure the tolerance band on everything in the process, including your measuring gear, and then establish the acceptable range of readings you get on that mic. If you have a dimension toleranced 7.5" +0/-0.002 and you zero your mic with that 7.000015" standard, then you cannot accept a part that reads 7.5" on the mic: the mic reads 15 millionths low, so such a part is actually oversize. Ideally, you'll also have determined the repeatability of that particular mic, and that too would figure into what's acceptable.
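The bookkeeping described above can be sketched in a few lines. This is only an illustration of the logic; the dimension, tolerance, and function names are taken from the example in the post, not from any real inspection procedure.

```python
# Illustrative sketch: fold the standard's calibrated offset into the
# acceptance band for mic readings. Numbers match the example above.
NOMINAL = 7.5              # drawing dimension, inches
TOL_PLUS, TOL_MINUS = 0.0, -0.002
STANDARD_ERROR = 0.000015  # the 7" standard is 15 millionths oversize

def acceptable_reading(reading):
    """A mic zeroed on the oversize standard reads low by STANDARD_ERROR,
    so the part's true size is (reading + STANDARD_ERROR)."""
    true_size = reading + STANDARD_ERROR
    return NOMINAL + TOL_MINUS <= true_size <= NOMINAL + TOL_PLUS

# A part that reads exactly 7.5" is rejected: its true size is 7.500015".
print(acceptable_reading(7.5))    # False
print(acceptable_reading(7.499))  # True
```

Repeatability of the particular mic would shrink this band further; that term is left out here for clarity.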
 
Maybe you will use the standard some day to set the travel on a machine using a 20-millionths indicator. If so, you will be able to fudge the .0000005 part!
 

You should probably have posted that in Metrology.

When there are that many zeros in inches I prefer to think metric, and it is given as 0.4 µm over 7". If you want to use something to calibrate something else, it should preferably be at least 10 times as accurate as what you want to verify. IOW, to accurately verify (calibrate) a 0.0001" micrometer, what you use should be accurate to 0.00001".

Use it to check your micrometer now and then and you have no worries. That accuracy for a standard isn't unusual; in fact, a less accurate one would be.

Gordon
 
Yes, you definitely want your standard to be more accurate than the instrument it is used to calibrate. It is simply a small Jo block.
 
Speaking loosely, steel expands or contracts about 15 microinches over 7 inches when its temperature changes 1/3 degree Fahrenheit.
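That back-of-envelope figure is easy to verify. The sketch below assumes a round-number coefficient of thermal expansion of about 6.4e-6 in/in/°F for gage steel; the exact alloy's coefficient will differ slightly.

```python
# Back-of-envelope thermal growth of a steel standard.
# CTE is an assumed typical value for gage steel, not a spec number.
CTE_STEEL_PER_DEGF = 6.4e-6  # in/in/degF, approximate

def thermal_growth(length_in, delta_t_degf, cte=CTE_STEEL_PER_DEGF):
    """Return the change in length (inches) for a temperature change."""
    return length_in * cte * delta_t_degf

# A 7" standard warming by 1/3 degF grows by roughly 15 millionths:
growth = thermal_growth(7.0, 1.0 / 3.0)
print(f"{growth * 1e6:.1f} microinch")  # ~14.9 microinch
```

In other words, the entire 15-millionths deviation stamped on the standard is on the order of a third of a degree of temperature drift.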

And that's why calibration labs let the part (in this case a standard) and whatever is being used to measure it soak in the same temperature-controlled environment for at least 24 hours. Any self-respecting calibration lab also states the "measurement uncertainty" on the calibration certificate. No measurement is truly "exact".

The bottom line is that the more accurate you want your measurement/calibration result to be, the more care and thought has to go into it. As far as a standard rod and a micrometer go, then, as I've already written, you should be good to go in the shop.
 
Given that the standard discussed here measures .000015" over nominal, as stated by the manufacturer, it makes me wonder what my 1", 2" and 3" standards from the same maker are in reality.
 

Measurement uncertainty wouldn't apply to temperature as such, Gordon; all instruments I've come across are calibrated at 20 °C. Should they be calibrated at a different temperature for some bizarre reason, this wouldn't be an "uncertainty": it would be a known, which would be calculated and the size compensated to what it would read at 20 °C. The uncertainty attaches instead to the length correction applied for that temperature adjustment. Slightly different materials expand and contract by slightly different amounts with temperature. Since the calibration lab wouldn't know the precise alloy of the steel used, hence its precise expansion/contraction coefficient, they would instead use a default value (as alluded to by John above) and attach an uncertainty to that adjustment.

Hopefully that all makes sense.
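The known-correction-plus-uncertainty idea above can be sketched as follows. The nominal CTE and its assumed alloy-to-alloy spread are illustrative defaults, not values from any particular lab's procedure.

```python
# Sketch: compensate a measured length back to 20 C, carrying an
# uncertainty contribution from not knowing the alloy's exact CTE.
REF_TEMP_C = 20.0
CTE_NOMINAL = 11.5e-6  # per degC, typical default for steel (assumed)
CTE_SPREAD = 1.0e-6    # per degC, assumed spread across steel alloys

def compensate_to_20c(measured_len, temp_c):
    """Return (length corrected to 20 C, uncertainty from the CTE guess)."""
    dt = temp_c - REF_TEMP_C
    corrected = measured_len * (1 - CTE_NOMINAL * dt)  # known correction
    uncertainty = measured_len * CTE_SPREAD * abs(dt)  # unknown residual
    return corrected, uncertainty

# A 7" rod measured at 21 C: the correction itself is ~80 microinch,
# but the alloy-dependent uncertainty on it is only ~7 microinch.
length_20c, u = compensate_to_20c(7.0, 21.0)
```

Note the structure: the bulk of the temperature effect is removed as a known, and only the residual (from the unknown alloy) survives as uncertainty, which is the point being made above.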
 
It's worth noting that Mitutoyo literature states that the JIS / DIN permissible-error tolerance for Grade 0 gage blocks between 150 mm and 200 mm nominal length is +/- 0.50 micrometer (aka micron) [ http://www.mitutoyo.com.sg/documents/manuals/individual/5-1_Gauge block.pdf ]. Since 7 inch is between 150 mm and 200 mm, and 15 microinch is less than 0.50 micrometer, the micrometer standard in question is within the JIS / DIN Grade 0 gage block tolerance band for its length.
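The unit comparison behind that conclusion is a one-liner, shown here just to make the inch/metric crossover explicit:

```python
# Quick unit check of the claim above: is a 15 microinch deviation within
# the +/- 0.50 micrometer Grade 0 band quoted for 150-200 mm blocks?
MICROINCH_TO_MICROMETER = 25.4e-3  # 1 uin = 0.0254 um

error_um = 15 * MICROINCH_TO_MICROMETER  # 0.381 um
print(error_um, error_um <= 0.50)        # 0.381 True
```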
 
So back in Samiworld (AKA the real world)... when it gets to that number of places after the decimal point, he's long lost the will to live!

P.S. And you can take that anyway you like.
 