
Using calibrated measuring equipment

More and more are having their measuring equipment calibrated, especially of course those in ISO 9001 (and similar) shops and companies.

I still read, though, that some (my guess would be mainly old-timers) prefer the use of "feel" when, for example, using a micrometer. This makes me wonder: how many of those who prefer "feel" have seen measuring equipment calibrated by a professional? And how, in fact, is measuring equipment inspected before it leaves the manufacturer?

If the measuring equipment has some kind of "attachment" to give a constant "feel", then that is what is used. Calibration results are documented very accurately, but as to whether they can be repeated using "feel" and "experience", I doubt it, as I've never seen two individuals with exactly the same "feel" or "experience". The most common response from those types is "I've always done it like that and no-one is going to teach me or tell me differently".

Of course, if the measuring equipment does rely on "feel", then I've never read or heard what or how much "feel" should be applied so that just about everyone obtains the same result. Personally, I'd appreciate it if the manufacturers of measuring equipment that relies on "feel" gave some kind of ± for what they recommend be used.
 
Interesting thoughts. I have an older Mits mic set that has tenths, but no ratchet or friction thimble. I honestly could not say how the tech 'measured' any type of 'feel' (or if that is possible), as we all know you can wrench these types of mics down to make them read smaller. What I do know is that if I check them against gage blocks they are within a tenth or two. The only thing I can think of is that when I mic something I make a conscious effort not to look at the reading as I am doing it. Don't know if that matters, but at least I am not tempted to 'make' the measurement I need.
 

Keep doing that and you can't do much better.

Can't even remember the last time I saw a micrometer without a ratchet or friction thimble. 40 years plus? Are they even made any more?
 
I recently bought a new Mitutoyo 4-5 mic, and it was .002 off right out of the box, so it didn't appear to me that they'd made any attempt to calibrate it.
So if you have 2 inspectors and one set of master mics for final inspection, who has the right "feel"? That's a good question. I would say that if you have a part where the tolerance is so tight that a tenth or two of difference in feel matters, you probably shouldn't be doing final inspection with a mic to start with. Gage blocks and an indicator will always prevail.
 
Sure, a torque-control clutch helps, but I think you confuse feel with being square and not having the instrument at an angle.

Larger sizes, especially in the 6 to 18" range on round objects and bores, require 10 to 100 times more care in keeping the instrument from sitting at a bad angle.

That's why shallow bore gages sit on the bore face and you move one end back and forth to get the max reading to .0001". They're usually set to a ring gage that is kept in the machine at coolant temperature, with the gage set in the orientation it will be used in, vertical or horizontal. It helps with repeatability. High-precision gages use an indicator, so only indicator pressure is applied and there are no high or low pressures.

The usual problem I get in measuring flat objects is .00005" waves in the surface. They can affect repeating a reading to .0001". There are many reasons for the waves, not least hardness variations and machine servo oscillation.

I also see, for example, the machine moving in the Y direction, then stopping, then drifting slowly in the X direction as hydraulic pressure bleeds off over 20 to 30 seconds. I often have to use G4 delays to give the machine time to stabilize its position. I often see a 10-ton table stop from a high rapid in X and then go back and forth .0005" for a few seconds, like waves in a water bed. Many heavy machines use hydraulics to relieve weight, so the machine acts like it is on springs.

Often bores and flat surfaces are not true and have taper, out-of-round, and waves, for example.

A calibration sticker does not mean much to me. I check all gages before use against gage blocks or a ring gage. For example, on a tri-mic bore gage, if a handle extension is put on, the zero setting is often off by over .001". I never know whether somebody took the handle extension off one gage to use on another, and whether the gage it came off of was rechecked without the extension. I was taught to check a gage before using it. You never know what the last person to use a gage might have done to it.
 

Attachments

  • ShallowBoreGage_1.jpg
  • ShallowBoreGage_2.jpg
  • BN-25a_2014July10c.jpg

I'm not disagreeing with what you write, but you're taking this thread to a level I hadn't intended. What I had in mind was regular hand-held measuring equipment of the type used daily by practically everyone working in a shop.

As you correctly point out, if you can't hold it correctly, then you won't get a correct measurement with it. That does require experience although some never learn.

Calibration sticker? What I was thinking about, more than just the sticker, was the documentation stating how far the instrument was from the correct dimension. Getting back to something as simple as a micrometer: "zeroing" it at a given dimension doesn't necessarily mean that it's also spot on at all distances. A good calibration should state the variations from "perfect" at various intervals. Some get to see the calibration results, while others get "only" a sticker saying it's within spec. The question then remains: how many know what the accuracy spec in fact is?
 

Why all the long-winded wind-up?

Give us your bullshit ass sales pitch.

Then crawl back in your hole, bitch.
 
Meh. Feel and experience are only useful when they are gained by checking the measuring tool against a known good reference, and using that, to confirm that the feel actually gets the operator an accurate result.

Having had to endure the railing of an idiot child who was certain he could accurately measure to tenths with his digital caliper (no, he couldn't, as proved by handing him a gage block to measure), even a valid cal sticker only means that under controlled circumstances, a careful worker was able to confirm that the measurement offered up as true matched the calibration standard near enough.

Cheers
Trev
 
I don't like "feel", but humans are actually very good at repeating it.
The problem is most people have a different feel from each other.
A zero for you is not a zero for others. On a digital, most people will zero or preset for their feel and stay with the touch they use.
Hand it to another without resetting and you get different numbers. Let them remaster and things settle down a bit.
Calipers are very sensitive to feel; mics somewhat too, but not so much. Indicating mics and LVDTs take you out of the equation for the most part, but you can still wiggle them in.
CNC CMMs and machine vision systems don't care much about people, and you can't bias them that teeny tiny bit towards zero or the limit you want to squeak into.
No-feel gauges suck, as they don't have any feelings for you. They are cold black boxes with no understanding that I'm doing the best I can.
Bob
 
I don't know if this is an "urban legend" or true, but years ago I heard that the same thread plug gauge was sent to 3 different authorized calibration labs and all 3 results were different. This poses the question: which one, if any, was correct?

I wonder how many think about the fact that when measuring a 60° thread over wires, the pressure used is less critical than when measuring a 29° (Acme) thread. The smaller the flank angle, the more the wires tend to "wedge".

My "point" is that I don't believe I've ever seen the "feel" to be used specified in a calibration certificate.
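As a rough illustration of that wedging effect (a sketch, not anything taken from a certificate): the standard simplified over-wires relation is M = E + W(1 + csc a) - (P/2) cot a, with E the pitch diameter, P the pitch, W the wire diameter, and a the flank half-angle. The sensitivity of the reading to where the wires actually seat is dM/dW = 1 + csc a, which a few lines of Python can compare for the two thread forms:

import math

def over_wires(E, P, W, included_angle_deg):
    # Simplified measurement over wires (lead-angle correction ignored).
    # E = pitch diameter, P = pitch, W = wire diameter, same units throughout.
    a = math.radians(included_angle_deg / 2.0)  # flank half-angle
    return E + W * (1.0 + 1.0 / math.sin(a)) - (P / 2.0) / math.tan(a)

# dM/dW = 1 + csc(a): how strongly a wire seating/deflection error
# shows up in the reading over the wires.
for name, angle in (("60 deg UN", 60.0), ("29 deg Acme", 29.0)):
    a = math.radians(angle / 2.0)
    print(f"{name}: dM/dW = {1.0 + 1.0 / math.sin(a):.2f}")
# 60 deg UN: 3.00, 29 deg Acme: 4.99

So the same small change in how deep the wires sit under measuring pressure reads roughly 1.7 times larger on an Acme form, which is consistent with the point above about measuring pressure being more critical there.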
 
Whenever I have had my kit calibrated (as per ISO 9001) it was not zeroed per se, just checked that it was accurate over a range of measurements. At a place I used to work, a guy would come into our inspection room and check all our tools against his set of gauge blocks. He was really good and would tell us where tools were getting worn, how they could be repaired, and how long they were likely to stay in tolerance!

It might be different with old-style measuring kit (all mine is fairly new Mitutoyo digital), but it never came back reading zero, just cleaned, with fresh batteries and a new sticker on. It was up to the person using them to set them to read zero!
 

And that's the way it's supposed to be. I think if calibration labs were also expected to "fiddle" around adjusting that would not only add on cost but also risk liability on their part.

"It was up to the person using them to set it to read zero". That certainly helps users to get to know the equipment so I'm all in favour.
 

So where is the sales pitch?

I mean we all know that your cheap Chinese junk is superior.
 

What really is zero? How is it traceable to NIST? Is it available for me to verify my zero to?

In my experience, it was only the qualified calibration tech who was allowed to make any adjustments to measuring equipment, even if it was to set "zero", whatever that means. It cannot be measured, so a zero set is a reference to me.

It was also expected of the guys who calibrated for me that any adjustments be approved and verified by the equipment owner/user, so that everyone was on the same page.
 
I never much cared what a device said 'at zero' because I never measure a zero. Some people are paranoid about closing the anvils or jaws and making CERTAIN it reads zero. It's sometimes sabotage to /make/ your tool read zero when the jaws are closed. I mostly think of calipers, for this. The caliper could have been calibrated by checking it against master gage blocks (NIST traceable and periodically verified) over the entire range of use, from .050 to 6.000 or whatever. But who the hell checks the zero or even cares what it says with the jaws closed? To set the readout of your measuring tool based upon an arbitrary and useless position is illogical. Plus you may be voiding that calibration. I think it's just a training issue.

My opinion - haven't thought about it too deeply. Sometimes my gripes are irrational. Don't take it as gospel.
 

Not sure how to reply to that other than "I don't agree". What you describe as "paranoid" I consider "being accurate".

However if you are interested in why I don't agree then it's because measurements (as in calibration) have to start and finish somewhere. Let's take two of the most common instruments, a micrometer and a digital caliper.

A 0-1" micrometer can be zeroed, and should be before calibration starts. This is done by making sure both faces are clean before starting. A good calibration will show the actual measurement result at 10 predetermined lengths over the 1" travel. This means that any variation from what the micrometer should show at the various length steps is the micrometer's inaccuracy.

Exactly the same principle applies to a digital caliper and all other measuring instruments I can think of. You start at the "beginning". With a 1-2" micrometer this means 1", which is the reason a 1" standard rod is included in the box.

I would suggest that anyone really interested should watch a qualified calibration technician as to what and how they actually do things.

I must admit I've never really paid much attention to how Americans handle their measuring equipment but I have with many Europeans in various countries. Most are very careful and check a second time before proceeding.
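For illustration only, here is a minimal sketch of what such a 10-point check produces on paper. The nominals follow the block sizes commonly used for mic checks (picked so the spindle stops at varying rotational positions); the readings below are invented:

# Hypothetical 10-point check of a 0-1" micrometer against gage blocks.
nominals = [0.105, 0.210, 0.315, 0.420, 0.500,
            0.605, 0.710, 0.815, 0.920, 1.000]
readings = [0.1050, 0.2101, 0.3149, 0.4200, 0.5001,
            0.6052, 0.7100, 0.8149, 0.9200, 1.0001]

print(f"{'nominal':>8}  {'reading':>8}  {'deviation':>9}")
for nom, rdg in zip(nominals, readings):
    print(f"{nom:8.4f}  {rdg:8.4f}  {rdg - nom:+9.4f}")

spread = [r - n for n, r in zip(nominals, readings)]
print(f"deviation range: {min(spread):+.4f} to {max(spread):+.4f} in")

The deviation column, not the zero reading, is what the documentation would show, and whether the resulting spread is acceptable depends on the instrument's accuracy spec.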
 
I understand where you're coming from and I don't disagree. You're right, it should read 'zero' at zero; I just never cared. And I think one should be very careful when adjusting their device to read 'the way they like it', to make sure they're actually making good or better adjustments. That's usually done by the "I use my own feel" folk.

I mainly think of the people on the floor going "by feel" and adjusting their measuring devices because they didn't read 'zero' when they closed the jaws. Too often it's because there's shit on the jaws that prevents them from closing. I suppose I was mainly going a bit off topic, regarding tool maintenance and cleaning, but that's something I always assume should be thrown in along with verifying calibration before using them, like someone said before me.

I don't think there is any real difference between "American" and "European" handling of measuring devices. I believe there is simply a difference between the "right way" and the "close enough for the shit I do in my garage" which exists everywhere in the world.
 

Even a caliper with loose jaw gibs will realign itself to measure zero at the closed position, but all the rest of the readings taken over actual gage blocks will have a systematic error. I agree zero doesn't mean much for a caliper.

But Gordon's original question is interesting. I think to establish real standards, you'd have to remove feel, and springs and other crap that cannot be checked without further standards of comparison. Perhaps a torque arm assembly with a standard weight attached at a given radius, at a given angle, at a given temperature, with a certain lubricant viscosity, should be employed to rotate the micrometer spindle against the gage.
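Sizing that idea on the back of an envelope (all figures below are assumptions, not a spec): with an inch micrometer's 40 TPI spindle screw, an assumed screw efficiency around 0.35, and a target measuring force in the 5 to 10 N range of a typical ratchet stop, the hanging weight comes out surprisingly small:

import math

g = 9.81                   # m/s^2
lead = 0.025 * 0.0254      # 40 TPI spindle -> 0.635 mm lead, in metres
efficiency = 0.35          # assumed screw efficiency (friction-dominated)
target_force = 7.0         # N, an assumed mid-range measuring force
arm_radius = 0.050         # m, an assumed 50 mm torque arm

# Power-screw relation F = 2*pi*eta*T/lead, solved for the torque T,
# then for the mass hanging at the end of the arm.
torque = target_force * lead / (2.0 * math.pi * efficiency)
mass = torque / (g * arm_radius)
print(f"torque: {torque * 1e3:.2f} mN*m, hanging mass: {mass * 1e3:.1f} g")
# -> about 2 mN*m and roughly 4 g, so friction and lubricant viscosity
#    really would dominate, as suggested above.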
 
I think you have to prove that your measuring system in the calibration process is robust and acceptable. You can do this by running a GR&R on your calibration measurement process, as sketched below. This will shed a great deal of light on how acceptable your calibration system really is.
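A much-simplified sketch of the kind of numbers such a study separates (invented data, and pooled standard deviations rather than the full AIAG ANOVA method):

import statistics as stats

# Three operators measure the same two parts twice each (invented data, inches).
measurements = {
    "op_a": {"part1": [0.5001, 0.5002], "part2": [0.7503, 0.7502]},
    "op_b": {"part1": [0.5003, 0.5004], "part2": [0.7505, 0.7505]},
    "op_c": {"part1": [0.5000, 0.5001], "part2": [0.7501, 0.7502]},
}

# Repeatability: pooled within-cell variance (same operator, same part).
cell_vars = [stats.variance(trials)
             for parts in measurements.values() for trials in parts.values()]
repeatability = (sum(cell_vars) / len(cell_vars)) ** 0.5

# Reproducibility: spread of the operator averages over the same parts.
op_means = [stats.mean([x for trials in parts.values() for x in trials])
            for parts in measurements.values()]
reproducibility = stats.stdev(op_means)

print(f"repeatability   ~ {repeatability:.5f} in")
print(f"reproducibility ~ {reproducibility:.5f} in")

If the combined R&R spread is small against the tolerance being inspected, the calibration measurement process can be called acceptable; if not, the "feel" differences discussed in this thread are exactly what it will expose.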
 