Steve H. Graham
Aluminum
- Joined
- Nov 6, 2007
- Location
- Miami
I have a question for a real metrology expert who knows something about gage blocks.
People are telling me to get gage blocks to calibrate my micrometers. Gage blocks come in different grades. The cheap ones have a tolerance of +/- 0.0005". The good ones are much more accurate. I wouldn't mind blowing $150 for +/- 0.0002", but I don't machine for a living, so I don't want to spend much more than that.
To calibrate a big micrometer, I might have to stack a bunch of blocks. If I do that, are the errors going to add up, or are they likely to cancel out so the overall error is small?
It would depend on whether manufacturers are likely to err consistently on the low or high side. I have no idea whether that's true or not, but I thought someone here might know.
If I have to stack five or six blocks to calibrate a micrometer, and I'm using cheap blocks, I could end up a thousandth or more off if the errors add up. My micrometer would be only slightly more accurate than calipers.
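For what it's worth, the two scenarios work out very differently. If every block's error falls the same way, the errors add linearly; if the errors are independent and random, the usual root-sum-square estimate applies and the stack error grows only with the square root of the block count. A quick sketch, using a hypothetical six-block stack of cheap +/- 0.0005" blocks (the numbers are just examples, not claims about any particular set):

```python
import math

tol = 0.0005      # assumed per-block tolerance for a cheap set, inches
n_blocks = 6      # hypothetical stack size

# Worst case: every block errs the same direction, errors add linearly.
worst_case = n_blocks * tol

# Root-sum-square: errors independent and random, partial cancellation.
rss = tol * math.sqrt(n_blocks)

print(f"worst case:   +/- {worst_case:.4f} in")  # +/- 0.0030 in
print(f"RSS estimate: +/- {rss:.4f} in")         # +/- 0.0012 in
```

So even in the pessimistic worst case the stack is off by a few thousandths, while the statistical estimate is closer to a thousandth. Which one applies depends on exactly the question asked above: whether a manufacturer's errors are biased in one direction or scattered around nominal.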
Someone suggested getting standards, but I notice they tend to claim tolerances of +/- 0.0002". They sound like they're not a whole lot better than relatively cheap blocks. To measure tenths reliably, I would have to send the standards in to be measured, at considerable expense. Am I wrong about that?