Should Gauge Block for Micrometer Calibration Correspond to the Micrometer's Maximum Measurement Size?


Jan 19, 2024
I know this may be a dumb question but I'm new to micrometers...
So I have an analog micrometer with a 0-25mm range, but I only need it for measurements under 5mm. It came with a 25mm "calibration pin", and after getting two more of these pins from various sources, I noticed a ~0.03mm deviation between them.
Since I only need measurements under 5mm, could I eliminate this 0.03mm deviation by zeroing the micrometer with a 10mm gauge block and then checking the zero with 5mm and 1mm gauge blocks?

Or would I need to use a 25mm gauge block, then recheck with a 10mm, and then a 5mm and/or 1mm block?

Why aren't you trusting zero and starting there? Your standards should not be that far off. What brand and model of micrometer? Do the tips close square and sharp with only slight twist of the thimble?

Also try closing the micrometer on a business card and pulling the card through to clean the tips before measuring. Your feel and the repeatability of your method are critical with a micrometer.
The "calibration pins" you mentioned are also known as "micrometer standards". They're only used as a sanity check to make sure the micrometer is reading its "zero" point correctly. They typically come with micrometers larger than 1". When a micrometer is calibrated, it's checked at several points through its full range. Gage blocks are made to a much tighter tolerance than micrometer standards.
Gage blocks are made to a much tighter tolerance than micrometer standards.
I'd argue that this isn't a completely accurate statement; it depends on the grade of gauge blocks you're using. Micrometer standards from major vendors are manufactured to a precision similar to gauge blocks: Mitutoyo claims +/-0.00005" accuracy on their standards, which is roughly Grade 0 for a 25mm unit. For longer standards, you'll find they can actually come out ahead, since you'd likely need to stack several blocks to reach an equivalent length, and the rod shape and thermal insulation of a standard make it quite stable.
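For a sense of scale, here's a quick back-of-the-envelope check, sketched in Python. The tolerance figure is the Mitutoyo claim quoted above and the 0.03mm deviation is the one the OP reported; the point is simply that the observed spread is roughly twenty times larger than what standard tolerance alone could explain.

```python
# Back-of-the-envelope check: compare the observed deviation between the
# OP's 25mm standards against the manufacturer tolerance quoted above.
# Figures come from the thread; this is illustrative, not a spec lookup.

INCH_TO_MM = 25.4

standard_tolerance_in = 0.00005                         # +/- per the Mitutoyo claim
standard_tolerance_mm = standard_tolerance_in * INCH_TO_MM  # ~0.00127 mm

observed_deviation_mm = 0.03                            # spread the OP measured

ratio = observed_deviation_mm / standard_tolerance_mm
print(f"standard tolerance: +/-{standard_tolerance_mm:.5f} mm")
print(f"observed deviation: {observed_deviation_mm} mm (~{ratio:.0f}x the tolerance)")
```

In other words, even stacking the worst-case tolerances of two standards leaves the 0.03mm discrepancy unexplained, which points at technique or a bad standard rather than ordinary manufacturing variation.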

A 0-X micrometer is set to zero without one of these standards: you simply close the anvils. Using a standard to validate the extreme end of the tool's range is valid, but a discrepancy between two standards of supposedly identical length is telling you that your measurement technique is wrong or (assuming the measurements are repeatable) that one or both of the standards are junk. Figuring out the truth is left up to you, but my money is always on user error.

Calibrating a micrometer is done with a specialty calibrator and/or a set of optical parallel flats of specific thicknesses to check the reading at four different quadrants of the screw rotation. The flatness of the anvils and the perpendicularity of the screw need to be evaluated using the optical fringes that the parallels reveal. Unless you are a shop with a lot of micrometers to evaluate, buying this kit and learning how to use it is unlikely to be a good investment. Send the device out for calibration, live with the uncertainty, or throw it away and get something new.

It is much more likely that the OP is simply not using the micrometer correctly, as there is real skill involved in getting repeatable and accurate measurements, especially with very accurate instruments. If they really are not getting repeatable results, they should spend the money on a known-good Mitutoyo/Tesa/etc. digital micrometer, which will make measurement much easier; life's too short to be measuring with junk.