
Were micrometer standards ever made in .500" increments?

garychipmaker · Cast Iron · Joined Dec 2, 2005 · Location: IA
I work in the maintenance department of a factory. The reason I ask whether standards were ever made in .500" increments: say I'm using a 4-5" mic. I always check it before using it, as three other guys share it. So if the mic has been used at 4.500" and I want to check it with a standard, I have to run it .5" either way to check it, and then maybe use it at 4.500" again. And no, I don't trust not checking it.
 
+1 on what Milland said. Gage blocks of varying size can be "wrung" together to make standards of pretty much any size you wish.

The closer your standard is to your desired size, the better. Say you want your part to be 4.557": you can use a combination of blocks to make a standard of exactly that size if you have a complete set of blocks.
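For anyone curious how the block selection actually works, here is a minimal sketch in Python of the usual digit-elimination method, assuming the common 81-piece inch set makeup (the series below are my assumption; check them against your own set before trusting a stack):

def build_stack(target: float) -> list[float]:
    # Work in integer ten-thousandths of an inch to dodge float round-off.
    rem = round(target * 10000)
    assert 3000 <= rem <= 109000, "sketch only handles ordinary sizes"
    stack = []
    # 1) The .1001-.1009 series kills the ten-thousandths digit.
    if rem % 10:
        stack.append(1000 + rem % 10)
        rem -= stack[-1]
    # 2) The .101-.149 series kills the thousandths digit and steers the
    #    hundredths digit onto a 0 or 5.
    thou, hund = (rem // 10) % 10, (rem // 100) % 10
    if thou or hund % 5:
        stack.append(1000 + (hund % 5) * 100 + thou * 10)
        rem -= stack[-1]
    # 3) The .050-.950 series takes the remaining fraction in .050 steps.
    if rem % 10000:
        stack.append(rem % 10000)
        rem -= stack[-1]
    # 4) Whole-inch blocks, one each of 1" through 4".
    for b in (40000, 30000, 20000, 10000):
        if rem >= b:
            stack.append(b)
            rem -= b
    return [b / 10000 for b in stack]

print(build_stack(4.557))   # -> [0.107, 0.45, 4.0]

So 4.557" comes out as .107 + .450 + 4.000, three blocks wrung together.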

I have also seen them referred to as "Johansson" or "Jo" blocks, after C.E. Johansson, the Swedish inventor who developed this method of gaging.
 
If it is only used at 4.500", a 4.500" standard would tell you if there is wear at that point. More generally, though, wear could occur elsewhere: your micrometer could be perfect at 4.000", 4.500", and 5.000", all of which fall on full turns of the thimble, but have a periodic error at, say, every half-turn of the thimble (e.g. 4.0125", 4.4875", etc.). For this reason I have a Brown & Sharpe 12-piece set, accurate to +2/–0 µin., made specifically for this purpose, with individual gauge blocks to test deviations at various partial turns of the thimble.
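A quick aside on the arithmetic behind those check points: an inch micrometer screw is typically 40 TPI, i.e. .025" of spindle travel per thimble revolution, so half- and quarter-turn points fall .0125" and .00625" apart. A throwaway sketch (the 40 TPI pitch is the usual figure, not something stated in this thread):

PITCH = 0.025  # inches of spindle travel per thimble revolution (40 TPI)

def check_points(start: float, turns: int, fraction: float = 0.5) -> list[float]:
    # Sizes at every `fraction` of a turn across `turns` full revolutions.
    step = PITCH * fraction
    return [round(start + i * step, 5) for i in range(int(turns / fraction) + 1)]

print(check_points(4.0, 2))  # [4.0, 4.0125, 4.025, 4.0375, 4.05]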
 
The main purpose of the standard supplied with a micrometer is to set the "zero" point, not to check its accuracy. That is why a 0-1" or 0-25 mm micrometer does not come with a standard, but a 1"-2" micrometer comes with a 1" standard and a 25-50 mm one comes with a 25 mm standard.

As others have said, if you want to check the accuracy of a micrometer you should use gauge blocks (Jo blocks). Even the lowest grade of Jo blocks, often called "shop blocks," is good enough to check micrometers. And checking them only at whole numbers of rotations is not good enough: they should be checked at other intervals, half- and quarter-turn points at the very least.
 
A really poor practice is cranking a one-inch micrometer down to zero to check zero.
That is a lot of wear on the threads cranking all the way down, and banging into the zero stop a million times wears that one spot so much that it gets out of tune with the rest of the thread.

Nobody ever measures zero-sized parts, so accurately setting zero at zero is the least useful thing one can do with a micrometer. Better to find or make dead-size .2500", .5000", .7500", and .1000" standards to keep in your toolbox as reference checking gauges. IMHO.
 
PS: I would question your logic. The parts of a micrometer that govern its accuracy are all internal except the anvils, so they are well protected from damage. And no one can wear the thread out in a day, or even a week or a month. On the other hand, the standard or gauge blocks are completely exposed and therefore a lot more liable to be damaged.

If I got a sudden change in the reading on a mike standard or a Jo block (stack), I would sooner suspect that the standard or Jo block is the offender, not the micrometer, unless I had total knowledge of how that standard or Jo block had been handled between the good and bad readings.
 
If it is a one-user micrometer, then there is little or no reason to constantly check the zero. I set the zero on my first micrometer 50+ years ago and it has not changed. But I am the only one who handles or uses it. And I always close it to zero when it is not in use.

BUT, I do not BANG it. It has a click clutch and I always use that, at a SLOW speed, to close the micrometer on parts and to its zero.

On the other hand, if it is a shop tool that is passed around to several or many people as needed, then I can see a reason for checking the zero before use. Not everyone is as careful as I am.

But NO BANGING! EVER! Close it slowly with the click or friction clutch. It's a micrometer, not a C clamp.

Another good reason for never "banging" a micrometer closed is that it can change the reading. If you close it rapidly, momentum can take over and over-tighten it to the point of distorting the frame. Even slowly over-torquing it can do the same: you can go a full thousandth or more past the correct reading. The clutches are there to prevent this; in their absence, proper hand technique must be used.
 
I don't think storing your 1" micrometers closed down to zero is a good idea.

Mitutoyo says, "When storing the micrometer, always leave a gap of 0.1 to 1 mm between the measuring faces. Do not store the micrometer in a clamped state."
 
Sort of: Brown & Sharpe makes a set (it ain't cheap) for checking micrometers. It includes a pair of optical flats for checking flatness and parallelism of the spindle and anvil.
 
I never store my 0-1's closed up. I check my mics from time to time if I'm working on something close, but I've never had one of my personal mics lose its zero, and many of those I've had for nearly 30 years.

Actually, I take that back: one of my 0-1's lost its zero once after I let a younger kid in the shop borrow it. I went to use it after he brought it back and it was a full thousandth off. I immediately started looking for trouble by measuring a gage block, and I could literally feel that something wasn't right. Turned out that he had dropped it and the anvil and spindle centerlines were no longer coincident/parallel. I told him he'd better set some money aside from his next paycheck: carbide-faced spindles, tenths vernier, friction thimble. Order it and hand it to me by the end of next week, or you'll never touch another of my tools. He did.

And I don't feel that running the spindle in and out a lot will cause much wear, as long as the mics are properly maintained. I take mine apart and clean and oil the threads about once a year. I've never even needed to adjust the nut to take up clearance in my mics. Run 'em dry and you are in for trouble, IMO. I won't use a mic that free-spins the way I've seen some guys like them. Way too loose for me, and a sure sign they're not oiled.
 
I'm in the "open" camp on 0-1 mic storage, which I believe came from advice I read ages ago from one of the major makers of the dang things. Not that it's germane to the OP's post, but eh...
 
A really poor practice is cranking a one-inch micrometer down to zero to check zero.
That is a lot of wear on the threads cranking all the way down
Huh?
That's what it's made for: to move from 0 to 1" and anywhere in between... for decades. "Cranking" it down to 0 isn't going to wear the threads. I don't think even the cheapest hunk of Chinese shit would wear the threads.
 
I have old micrometers where the feel of the anvils at zero is not the same as when checking a Jo block at all other measurements. With carbide anvils, the only explanation is that the thread at that endpoint has been damaged.
Ask most apprentices (and some older guys) and likely they set the zero by closing the micrometer.
I do have a lab-grade set of Jo blocks, and have made gauges closer than Jo blocks.

It is easy enough to prove: take your oldest 1" micrometer and feel zero, then measure a certified Jo block to see if they match.

If your 1/2" Jo block is +6 millionths, then your at-zero and at-.500" readings should show the same; in my experience, older micrometers do not.

This is a good reason to have shop (certified) measuring tools, and not everybody's own tools, when doing tenth or sub-tenth work. Just having a CMM is not good enough, because finding off-size parts after the fact is often a big waste of time.
 
Likely you ask most apprentices and they set the zero by closing the micrometer..
The "zero" on an old, non-digital micrometer can be set anywhere using a gauge block, but how else are you going to set the zero on a digital micrometer without closing it?

Also, unless someone's blood is at 68 °F (or whatever the shop temperature is), thermal expansion of the micrometer's frame will change the zero during the day. The change may not be enough to matter for whatever part is being measured at the moment, but if someone only cares about measurements to the nearest ~0.001", calipers are easier and faster. If someone does care enough about a particular measurement to be using a micrometer, then checking the zero, or the reading at some value other than zero using a gauge block, immediately before making a measurement that needs to be accurate to 0.0001" is essential.
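To put a rough number on that, here is a back-of-envelope sketch using a handbook expansion coefficient for steel (my assumption, roughly 6.4 µin/in/°F): a 5" length warmed 10 °F by handling grows about a third of a thousandth.

ALPHA_STEEL = 6.4e-6  # in/in/degF, approximate handbook value for steel

def thermal_growth(length_in: float, delta_t_f: float) -> float:
    # Linear expansion: dL = alpha * L * dT
    return ALPHA_STEEL * length_in * delta_t_f

print(f"{thermal_growth(5.0, 10.0):.5f} in")  # ~0.00032 in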

Making measurements that are truly accurate to 0.001" isn't trivial, and making them to 0.0001" requires even more care. Gloves, micrometer stands, a micrometer standards set, temperature control, ratcheting or friction thimbles, etc. all become relevant when trying to achieve absolute accuracy of 0.0001", and essential when using one of the latest 5 micro-inch micrometers anywhere near the limit of its capabilities.
 
I set my indicating micrometers to a lab-grade Jo block at the size of the part I am measuring, so as to use the micrometer as a comparator to the Jo block/stack. You can't hold a few-millionths size by setting zero and traveling up the threads. IMHO.
When I took a part to the inspector I would know exactly what he would find.
My plate check or device check should be the same as his check, or we would be running a shoe factory: call it whatever size it comes out.

Re: that is why making the old stamped-number Jo blocks was a true craft; the grinder/lapper had to hit that size.
 
I set my indicator micrometers ...
Indicating micrometers are very useful but, although they have "micrometer" as part of their name, they are a different device from the micrometers the OP asked about. An indicating micrometer is zeroed on a standard, and deviations from that zero then depend on the accuracy and linearity of the dial indicator that is part of the device. Commonly that accuracy is claimed to be ±0.0001", although some claim twice that accuracy. That said, the OP only seems to be concerned with values to the nearest 0.001", an accuracy which is certainly easier to achieve with a micrometer than 0.0001" or better.
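In other words, the comparator arithmetic is just the nominal of the block stack plus the small deviation read off the dial. A hypothetical illustration (the numbers are made up):

nominal = 0.5000          # Jo-block stack the indicating mic was zeroed on, inches
dial_deviation = -0.0002  # deviation read off the dial on the part
part_size = nominal + dial_deviation
print(f"part measures {part_size:.4f} in")  # part measures 0.4998 in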
 
At the last place I worked I spent quite a bit of the time on "marking out and inspection". Part of the duties was to check all the measuring equipment in use in the shop, company-owned and personal property alike. If you didn't want your own tools checked, you had to take them home. This was done every six months and everything had to be noted down. It wasn't a job I liked. Checking loads of 6"-12" and 12"-24" mikes at every setting gets very boring really soon. That's before we got to the 5"-125" inside mikes. You very rarely found an error, but everything had to be gone through.

Regards Tyrone.
 