
0.001mm accuracy on a digital caliper?


ADFToolmaker

I just received a flyer advertising the new Sylvac (I think these are branded Fowler in the USA) caliper, which reads to microns; their claim is .003mm repeatability. What are you guys' thoughts? Is this a pointless resolution for this type of gage? I am often skeptical about even the .01mm accuracy my existing calipers read, especially after a bit of use. At $400NZD ($333USD) for a 150mm it is also the most expensive I have seen.
 
...reads to microns, their claim is .003mm repeatability...
You're using terms interchangeably that mean very different things.

"Reads to microns" is totally meaningless. It just reflects the number of digits in the display.
Add six more digits to the right of the decimal and it would read to picometres.

Repeatability means that if you return the jaw to exactly the same position, the display will be
within ±0.003mm of what you saw the last time it was at that position.

Those terms are not even remotely associated with accuracy.
If you measured a 10mm gage block and got a display of 15.000mm, it would still meet the performance specs
you quoted, provided you got a reading within ±0.003mm of 15.000mm each time you tried.
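
To put numbers on that distinction, here is a minimal Python sketch; the readings are invented for illustration and are not from any real caliper:

# Resolution, repeatability, and accuracy are three independent things.
# Invented readings from repeatedly closing a caliper on a 10mm gauge block.
readings = [15.001, 14.999, 15.000, 15.002, 14.998]  # mm

resolution = 0.001  # mm; just the last digit of the display, says nothing else
spread = max(readings) - min(readings)  # 0.004mm total, within +/-0.003mm of 15.000
mean = sum(readings) / len(readings)
accuracy_error = mean - 10.000  # about 5mm off, yet the specs above are still met

print(f"resolution: {resolution}mm, spread: {spread:.3f}mm, error: {accuracy_error:.3f}mm")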

- Leigh
 

Of course you are completely right. Now that you have corrected me: is a digital caliper that reads mm to 3 decimal places pointless? I.e., if you had a feature to check that needed to be between, say, 9.995mm and 10.005mm, would a digital caliper be a wise choice?
 
If you had a feature to check that needed to be between, say, 9.995mm and 10.005mm, would a digital caliper be a wise choice?
I would not think so.

Best practices would dictate using an instrument with a rated accuracy of ±0.001mm. That's quite a stretch for a caliper.

- Leigh
 
Imagine just what the heat from your hands would do to the caliper...

Actually not as bad as one would think if you test it.
Of course a mic has the same problem but a little more steel to "soak" the heat.

I do a demo for the rookies with a 1 inch gauge block showing the effect of heat on our gauges.
I hold the block in my hand for an hour while doing the tour of the shop so that it will grow.
You don't hold calipers in your hand for an hour.
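
For a rough sense of scale, the standard linear expansion formula (dL = alpha * L * dT) puts a number on this demo; the coefficient and the temperature rise below are assumptions for illustration, not measured values:

# Estimated growth of a 1 inch steel gauge block warmed in the hand.
# alpha and delta_t are assumed values for illustration only.
alpha = 11.5e-6   # per degC, typical for gauge block steel
length = 1.0      # inch
delta_t = 10.0    # degC rise after an hour in the hand (assumption)

growth = alpha * length * delta_t
print(f"growth: {growth:.6f} inch")  # ~0.000115", i.e. more than a tenth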

Want to see something that will really drive you nuts when measuring? Put infrared heaters in your shop, or place your CMM or plate where the sun shines through a window or skylight.

My problem with calipers at these kinds of numbers has mostly to do with wear on the beam over time, which can't be adjusted back out, as there is always more wear in the first inch than in the last inch.

New gauges are often very repeatable and accurate.
I only care about what they can do after thousands of measurements in a carbide grinding shop and how much you can trust a quick 2-3 second measurement.

If you get a "bad" number, then clean everything, get a good number, and quit there, you are headed for trouble.
Never measure just twice; once you are rechecking, do it 4 or 5 times, and try not to carry the natural "human" bias that the part is good.
Trust nothing; always assume something has gone wrong in the process and the part is bad.
Place the blame on the machine, the tooling, or the process. Make the checks "prove" you have a good part in your hands.

Bob
 
No chance in hell, unless it was made by Mitutoyo. I've got this set of calipers from Mity that I picked up somewhere and the damn things are more repeatable than some of my mics. I'll call them accurate to .00025"...so about 6 microns....if it had .100 per rev instead of .200 per rev, it would probably be even better. Took a while to learn 'em, but they astound me day after day. I had a double shear pivot pin to make today.....0.3740 hole...measured with the caliper after reaming with a 3/8 reamer that is confirmed .001 under. I polished 2/3 of the pin to 0.3735 and left the other 1/3 at 0.37425...measured with the caliper. It almost press fit home with my thumb....needed a light tap to get home. .00005" measurable slop in the assembly, using a Mity digital indicator (.00005" resolution), but it moves like it's on bearings. I don't trust digital calipers much... Dials are better.
 
Trust nothing; always assume something has gone wrong in the process and the part is bad.
Place the blame on the machine, the tooling, or the process. Make the checks "prove" you have a good part in your hands.
Parts are born to be bad. They want to be bad. They revel in badness, and spit on conformity. :eek:

It's the machinist's responsibility to discipline them and beat them into submission.

- Leigh
 
I seem to be the only one in this subforum (in PM?) that has tried and tested digital calipers with a display in 0.001mm.
???
All of my Mitutoyos have resolutions of .0005mm.

This is a common failing with you, Gordon. You think you're the only person on the planet with any experience.

Would I use one to measure a tolerance of ±0.005mm? No, not if I had an alternative option on hand, but I would use one to measure ±0.015mm without breaking a sweat.
The question was about a tolerance of ±0.005mm, not ±0.005mm.

- Leigh
 

Leigh, there are times you are so quick on the draw you shoot yourself in the foot. Unless I'm mistaken this thread is about digital calipers and not micrometers. If you have a Mitutoyo caliper with a display of 0.0005mm (you did write mm and not ins.) I'd love to know the item number.

What, too, is the difference between "The question was about a tolerance of ±0.005mm, not ±0.005mm"?

I'm thinking you read many of my posts as the devil reads the Bible. When I write much of what I do the way I do, it's almost always from hands-on experience. I don't think I've written an opinion on anything in your subforum based on a feeling or a hunch.

If you know anything I write to be wrong then correct me by all means. It might just be me but your remarks to me do tend to get very personal.

Gordon
 
Leigh, there are times you are so quick on the draw you shoot yourself in the foot. Unless I'm mistaken this thread is about digital calipers and not micrometers. If you have a Mitutoyo caliper with a display of 0.0005mm (you did write mm and not ins.) I'd love to know the item number.
OK. It is .0005", not mm. My error.

What, too, is the difference between "The question was about a tolerance of ±0.005mm, not ±0.005mm"?
That's an editing error. It was supposed to read
The question was about a tolerance of ±0.005mm, not ±0.015mm.

- Leigh
 
I would buy a 0.001mm resolution caliper if it did not cost much more.
.
I would pay 200% more if the caliper could adjust its scale: that is, read a 25mm gage block AND a 150mm gage block, and if a reading was off, adjust the scale so BOTH read good. On most high-end DROs there is an adjustment not just for zero but for whether 150mm (or 300mm) is actually reading right. The scale correction factor adjusts this.
.
As for trusting a caliper to 0.001mm: we have an OLD shop 300mm caliper that, if set with jaws closed, will read about 0.1mm off. The worn jaws make it necessary to set it to a gage block at the point in the jaws where you need to measure. Basically, short of grinding the jaws, it cannot be zeroed with the jaws closed and the readings trusted.
.
Carbide jaws would help, but basically any 0.001mm caliper would quickly wear past 0.001mm in a relatively short time.
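
The scale adjustment described above amounts to a two-point linear correction: the zero setting fixes the offset and a second gauge block fixes the gain. A minimal sketch, with invented block sizes and readings:

# Two-point scale correction, like the factor on a high-end DRO.
# Gauge block sizes and raw readings are invented for illustration.
block_lo, raw_lo = 25.000, 25.004     # mm, gauge block vs caliper reading
block_hi, raw_hi = 150.000, 150.031   # mm

gain = (block_hi - block_lo) / (raw_hi - raw_lo)
offset = block_lo - gain * raw_lo

def corrected(raw_mm):
    # Apply gain and offset so both calibration points read true.
    return gain * raw_mm + offset

print(corrected(25.004), corrected(150.031))  # 25.000 150.000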
 
I just received a flyer advertising the new Sylvac (I think these are branded Fowler in the USA) caliper, which reads to microns; their claim is .003mm repeatability. What are you guys' thoughts? Is this a pointless resolution for this type of gage? I am often skeptical about even the .01mm accuracy my existing calipers read, especially after a bit of use. At $400NZD ($333USD) for a 150mm it is also the most expensive I have seen.

I don't have the skill or experience Gordon and Leigh have. So for me, the answer is yes, this is a pointless resolution for this style of tool. Hand heat is one problem, jaw pressure/jaw deflection is the other. It takes some experience and practice to repeatedly return .001" readings on a normal caliper (in my opinion, regardless of the place of manufacture or brand).

That said, for guys like Gordon who need to depend on third-digit accuracy (.001"), it may be that a normal caliper can't do that and this one (which he's hinted at before) can.

In my world, I don't believe I could accurately OR precisely measure tenths with a caliper. Micrometers operate differently; mine have clutch mechanisms, which make it easier for a ham-fisted hack like me. I think the form factor of the caliper doesn't lend itself to this level of precision when handled by the likes of me.
 
As for trusting a caliper to 0.001mm: we have an OLD shop 300mm caliper that, if set with jaws closed, will read about 0.1mm off. The worn jaws make it necessary to set it to a gage block at the point in the jaws where you need to measure. Basically, short of grinding the jaws, it cannot be zeroed with the jaws closed and the readings trusted.
.
Carbide jaws would help, but basically any 0.001mm caliper would quickly wear past 0.001mm in a relatively short time.

If you have the block gauges, then setting a micron caliper to the size you want to measure and zeroing means that you will be taking a comparative measurement. This means that your measurement result will be more accurate than "just" a measurement. Caliper manufacturing specs state that a standard digital caliper's repeatability must be within 0.01mm.
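
A sketch of why the comparative reading wins, assuming a simple linear scale error (the error figure is invented for illustration):

# Direct vs comparative measurement with an assumed linear scale error.
scale_error = 0.0002   # mm of error per mm of travel (invented figure)
block = 10.000         # mm gauge block the caliper is zeroed on
true_part = 10.004     # mm actual part size

direct = true_part * (1 + scale_error)                         # error over full travel
comparative = block + (true_part - block) * (1 + scale_error)  # error only on the 0.004mm

print(f"direct: {direct:.4f}mm, comparative: {comparative:.4f}mm")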

The micron digital caliper I have has carbide jaws. I haven't yet seen one that doesn't have carbide jaws. I have a 3mm test ring, and when I measure it (others have tried too) I, as do all the others, get a caliper reading of 3.000mm. Usually the internal jaws' measurement is more inaccurate than that of the external caliper jaws. "Not bad" he says modestly :)
If I apply excessive force I can get it up to 3.002mm. That's when your thumb starts to turn white.

Re the price you give: think what a micron caliper can do. External, internal and depth. It could cost much more and still be a bargain ;)

Gordon
 
In my world, I don't believe I could accurately OR precisely measure tenths with a caliper. Micrometers operate differently; mine have clutch mechanisms, which make it easier for a ham-fisted hack like me. I think the form factor of the caliper doesn't lend itself to this level of precision when handled by the likes of me.

I don't know what "the likes" of you is like :) but I bet it'd take me only a few minutes and you'd be measuring as accurately as I can, assuming you don't have 5 thumbs on each hand :cheers:

The best advice I can give is to play with your digital caliper and a small, ground round cylinder until you can repeat the same measurement again and again and again. You'll then have the "feel".

Gordon
 
In my world, I don't believe I could accurately OR precisely measure tenths with a caliper.

First off, calipers have a "feel" that mucks things up.
I and my guys don't have 5 minutes to measure a part.
This means sometimes you go fast.

Leigh's post is interesting: "parts are born to be bad".
Unfortunately, too often rather than beat the parts into submission we beat the gauge to read what we want to see.

Higher quality calipers are worth the money if you are going to use them for finish dimensions. Given enough care, enough time, and a new gauge, there is no reason you can't hit the posted repeatability number.
None of this reflects real-world usage.
I've designed and built enough gauging systems that work fine in the lab but fail miserably on the plant floor to know the difference between what you can do and what works.

At +/-.0002 inches there is no way on God's green earth that I will trust a micrometer for measuring parts.
I have the library of 25 years worth of gauge R&R studies that prove it.
In a blind study, nobody can repeat to within a tenth let alone be accurate to this with a mic. :stirthepot: :)
I know you don't want to hear it and at one time I was on the same side as you are.

Numbers like these mean jumping into very expensive gauging, and few shops get this fact, or are willing to spend as much on something that just measures as they are on the machine tool. Checking parts is an afterthought.

This is always a problem for us when bringing in a toolmaker with 20 years of experience.
Everybody wants to think they can measure closer than they can, and the argument gets heated.
We have set up blind part tests to teach the errors that your brain just does not want to see.
Much of this is designed to tempt you into the wrong reading.

Take a set of marked gauge pins and blocks and have them lapped and polished .0002 to .0008 undersized.
Give these to an experienced machinist and have him record the numbers.
The normal trend towards "zero" becomes quite noticeable.

How much time you have to make a check also makes a huge difference.
The home machinist may have 20 minutes to "decide" on the number or average a bunch of checks; the production machinist needs the first number to be good so he can punch in the offsets and load the next part.
We had to implement time constraints on our R&R studies to give us shop floor numbers we could trust.
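
For anyone who hasn't met the term: a gauge R&R study splits measurement variation into repeatability (same operator, same part) and reproducibility (operator to operator). A bare-bones sketch of where the two numbers come from, with invented data; a real study would apply the standard d2 constants and report percentages of tolerance:

# Bare-bones gauge R&R arithmetic: 2 operators x 3 parts x 3 trials, data invented.
data = {
    "op1": {"p1": [10.001, 10.002, 10.001],
            "p2": [10.005, 10.004, 10.005],
            "p3": [9.998, 9.999, 9.998]},
    "op2": {"p1": [10.003, 10.002, 10.003],
            "p2": [10.006, 10.006, 10.005],
            "p3": [10.000, 9.999, 10.000]},
}

# Repeatability: average range within each operator/part cell.
cell_ranges = [max(t) - min(t) for op in data.values() for t in op.values()]
repeatability = sum(cell_ranges) / len(cell_ranges)

# Reproducibility: spread between operator averages.
op_means = [sum(sum(t) for t in op.values()) / 9.0 for op in data.values()]
reproducibility = max(op_means) - min(op_means)

print(f"repeatability (mean range): {repeatability:.4f}")
print(f"reproducibility (operator spread): {reproducibility:.4f}")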
Gauge repeatability is a complicated task, and after 25 years we still find things we need to change in our procedures so that we can trust the operators' and gauges' capability.
When measuring parts you make, do you clean the mic anvils with alcohol and check a gauge block before every measurement?
Probably not, but when checking the accuracy of your mic you may very well do this.

We want to make good parts, and we want to trust our measurements, or we can't make good parts.
This natural human bias distorts our view of the real world.

When testing any gauge your emphasis should be on how "bad" you can make it read.

What is the worst reading that you have ever gotten?
At some point it will happily give you this error when checking parts.

Bob
 
I wonder if anyone has thought of making digital calipers or micrometers where the display shuts off when there is movement and then comes back on with a command, such as pressing a hold button. :)

Seems it would help eliminate the bias factor...

I'd buy one
 
At +/-.0002 inches there is no way on God's green earth that I will trust a micrometer for measuring parts.
I have the library of 25 years worth of gauge R&R studies that prove it.
In a blind study, nobody can repeat to within a tenth let alone be accurate to this with a mic.
I know you don't want to hear it and at one time I was on the same side as you are.
I apologize in advance for going off topic, but I would like more information about what you mean by "gauge R&R studies." Some background to my question is:

I have over a dozen identical small parts with holes that all should be of identical dimension, which is nominally 0.1070". I also have a Starrett bore micrometer that covers the range 0.100-0.120", with resolution on the vernier of 0.0001" (I don't have it in front of me right now so can't tell you the model number; however, it's a "bench micrometer," in that it sits on the table, not held by the hand). To set the absolute calibration of the micrometer I have one Starrett and two Diatest ring gauges that fall within the range of the micrometer. The micrometer reads all three ring gauges to within 0.0001" of their marked values. Prior to the measurements I took the micrometer and parts from a cabinet at ~70-72°F and let them sit on my kitchen table at ~70-72°F for an hour to semi-equilibrate, and during the measurements my hand was in contact with the dial on the micrometer only long enough to make each measurement. The micrometer has a ratchet on its spindle, and each time I rotated it until the ratchet clicked twice before I read the value from the vernier.

Starting with the dozen or so brass parts (again, not in front of me, so I can't tell you the precise number of them) in random compartments in a container, I removed each in random order, measured its diameter with the Starrett micrometer to within 0.0001", recorded the result, then put the first one I measured in the first compartment of a different container. I repeated for #2, etc. When I was all done I put away the paper containing the results and started all over again. I took the parts in "randomized" order from the compartments (e.g. from #7 first), measured and recorded the results on a fresh sheet of paper, and put them back in the corresponding compartment of the other container. Yes, I know this isn't truly random, but it is as close as I could come to conducting a double-blind experiment on myself. Also, you'll just have to trust me that my memory is not nearly good enough to remember a dozen four-digit numbers recorded in random order over many minutes even if I were trying to, which I was not. If my subconscious memory is good enough to have done that, it has been keeping that a secret from my conscious memory for my entire life.

When done I compared the results from the two sets of measurements. Although the diameters of the holes on the different parts varied by ~0.0005", only one out of the approx. dozen pairs of measurements on the same part differed by as much as 0.0001", and it was different by only 0.0001". Anyway, for this reason I would be very interested if you could point me to published results on blind studies of measurements made with micrometers.
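
Reducing a two-pass blind test like this to numbers is straightforward: pair up the first and second readings of each part and look at the differences. A sketch with invented values in the same spirit as the experiment above:

# Compare two blind measurement passes over the same parts (values invented).
pass1 = [0.1072, 0.1068, 0.1070, 0.1074, 0.1069]  # inches, first pass
pass2 = [0.1072, 0.1069, 0.1070, 0.1074, 0.1069]  # inches, second pass, same parts

pair_diffs = [abs(a - b) for a, b in zip(pass1, pass2)]
print(f"max pair difference: {max(pair_diffs):.4f} in")          # operator+gauge repeatability
print(f"part-to-part spread: {max(pass1) - min(pass1):.4f} in")  # real part variation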
 