
Can I report my employer for incorrect quality practices?

Actually, calipers can be just fine; the technician must develop a "feel" that matches the instrument reading to the desired feature dimension. Measuring a confirmed good part repeatedly (with eyes close to the readout) is a great training tool.

I find frequent "re-calibration" of my own "feel" is part of any measurement protocol.
 
And from China lol

Really though, +/-.001" with calipers is about where I'd draw the line... with a good set, that is. I check mine with gage blocks periodically on the flats and tips of the jaws; the tips are the easiest to get worn or bent. I also check for light through the closed jaws. If they fail, they get "for reference only" written on them and I go to gage services for a new set. Someday they'll get me carbide like I ask for :)

Get carbide tipped dial calipers. I have cheap Chinese ones that agree with my gauge blocks within about a tenth.
 
I started doing quality inspection at a new company several months ago... Is there some official agency, ISO for example, I can report my company to for practicing these inaccurate methods...?
Report it to your local unemployment agency. It's a good place to be for a snitch, IMO. Or be a man and discuss the issue with your boss openly (and quit if not satisfied).
 
. . . I've seen two identical Starretts compared to a Millennium Series Limited Edition Mitutoyo.

Even though one Starrett has been used a lot and the other almost never, they read very close.
The brand-new Mitutoyo is about .0005 less on a standard. . . .

If either a micrometer or a caliper reads .0005" less than the standard, the response is to calibrate the tool. Might even be that the OP's ISO manual, like most, has a procedure for routinely calibrating instruments -- and would either re-calibrate or take a tool out of service that didn't meet spec.

It's quite common for both 0-1" micrometers and 0-X" calipers to show discrepancies between their closed "zero" position and measurements further out. That's why mic checking standard sets use a variety of gages, to catch errors at different screw rotations. Ideally, one uses a standard (with both micrometers and calipers) that's close to the measured dimension. This won't turn calipers (or most micrometers) into .0001"-accurate instruments (remember the 10x rule of thumb for instrument accuracy vs. inspection tolerance), but as Milland said, it should help remove some sources of error.
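
For what it's worth, that 10x rule is easy to sanity-check in a few lines of Python. Just an illustrative sketch; the function name and the numbers are mine, not any standard's:

```python
# Toy check of the 10:1 rule of thumb: the instrument's accuracy should
# be no worse than one tenth of the total tolerance band it inspects.

def meets_ten_to_one(instrument_accuracy: float, tolerance_band: float) -> bool:
    """True if the instrument is at least 10x finer than the tolerance band."""
    return instrument_accuracy <= tolerance_band / 10.0

# A +/-.001" caliper on a +/-.001" feature: the band is .002" wide, so
# you'd want an instrument good to .0002" -- the caliper fails.
print(meets_ten_to_one(0.001, 0.002))   # False
# The same caliper on a +/-.010" feature (band .020" wide) is fine.
print(meets_ten_to_one(0.001, 0.020))   # True
```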

Personally, I think the OP's bigger problem is that he's been hired to be the expert at his job, but his new company clearly doesn't think he's worth listening to. He can either work to gain the needed respect and understanding (turning the company in to some mythical ISO police force isn't the answer) or find another workplace that values his input.

In this case, it could be he's right: that a part being, say, +/-.0018" rather than +/-.001" would be catastrophic for the customer. If so, bravo for keeping the customer's needs in mind; they pay all our bills. It could also be that he's making a mountain out of a molehill.

The QC guy most companies want will find a way to measure the desired accuracy quickly and cheaply - and even help pinpoint and solve process capability problems should they become an issue.
 
It takes a bit of practice to get a good and somewhat accurate/repeatable feel with calipers. Some guys have a very hard time with it. I would trust a good set of calipers to .001" when used as a comparator with an on-size gage block. That's me myself. There are other guys I've worked with that I would not trust to be able to hold +/-.003" or maybe worse.
 
I.... That's me myself. ...
Strange how we trust ourselves in measuring such but do not have that same trust in others.
A bias perhaps?
I thought I was really, really good and careful with calipers and mics. Even had a belt holster for my favorite 0-1.
Then I got into the gauge building world and got my butt spanked.
 
Strange how we trust ourselves in measuring such but do not have that same trust in others.
A bias perhaps? . . .

No, actual comparison of parts measured by myself and others with a caliper against subsequent measurement by micrometer. Some guys just don't have the knack or feel. Obviously it's important to use the right tool for the job, and to understand proper usage of the chosen tool. Is a caliper the right tool for the OP's job? Couldn't say, not there myself, but unlikely. Can a caliper be used to measure to .001"? Sure, under certain circumstances (quality caliper, compared against a standard) with the right user. Is it going to be right there, every time, with every user? No.
 
Strange how we trust ourselves in measuring such but do not have that same trust in others.
A bias perhaps? . . .

Could be both. The vast majority of us think we're above average. There are half a dozen names for this psychological bias (superiority bias, the Lake Wobegon effect, the Dunning-Kruger effect, etc.).

In the case of micrometers, if a user has developed "feel" they will get somewhat more repeatable measurements. The problem is that one user's "feel" will be different from another's - you can't run a plant-wide inspection system that way and expect interchangeable parts. "Calibrating" one's own feel to a standard helps. Ratchets, friction thimbles, and pressure indicators largely solve that problem across multiple users - the "feel" is built into the instrument itself.
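
To put rough numbers on "calibrating one's own feel": measure a known standard several times and look at your bias and your spread. A minimal Python sketch, with invented readings in inches:

```python
import statistics

STANDARD = 1.0000   # certified gauge block size, inches (made-up example)
# Five repeat readings of the same block -- invented numbers.
readings = [1.0005, 1.0010, 1.0005, 1.0000, 1.0010]

bias = statistics.mean(readings) - STANDARD   # your personal systematic offset
spread = max(readings) - min(readings)        # repeatability, as a simple range

print(f"bias {bias:+.4f} in, spread {spread:.4f} in")
```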
 
I would not trust calipers to +/-.001"; for the record, I don't have the hands of a brain surgeon or a piano player.

Better to have the brains of a hand surgeon than the hands of a brain surgeon. Nowadays they all use the da Vinci robots.

https://www.davincisurgery.com/

Makes delicate things a lot more repeatable; they compensate for shaky and clumsy hands.

Not sure about the hands of a piano player; maybe a violin needs more nimble fingers or a better touch.

But yeah, for one thou, calipers are iffy.
 
. . . The problem is that one user's "feel" will be different from another's - you can't run a plant-wide inspection system that way and expect interchangeable parts. "Calibrating" one's own feel to a standard helps. . . .

Maybe you can calibrate your own feel against a standard or a CMM, but you cannot run a plant on feel alone :). Hence the CMM beats them all; I guess the CMM just does not give a shit about feel-ings.
 
. . . "Calibrating" one's own feel to a standard helps. Ratchets, friction thimbles, and pressure indicators largely solve that problem across multiple users - the "feel" is built into the instrument itself.

That's what the gauge blocks are for. If I want to measure a feature that's, say, 4.500" +/- .001", and I want to use a caliper, I should use that caliper to take a reading on a 4.5000" gauge block stack. When I have a feel and calibration that repeatedly (over several readings in a row) gets 4.500" on the caliper, with the needle lining up dead nuts on the 0 line, I have suitable gauge R&R to reliably measure the part. This is why I'll trust a dial caliper more than a digital: you can see a tenth of reading variation on the dial, but the digital only resolves .0005".
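
Here's that acceptance check as a quick Python sketch. The per-reading threshold is my own assumption (half the tolerance), not a formal gauge R&R study, and the readings are made up:

```python
NOMINAL = 4.5000          # gauge block stack, inches
ALLOWED_ERROR = 0.0005    # assumed: half of the +/-.001" tolerance per reading

# Several readings in a row on the stack -- made-up numbers.
readings = [4.5000, 4.5002, 4.4996, 4.5000, 4.5004]

if all(abs(r - NOMINAL) <= ALLOWED_ERROR for r in readings):
    print("caliper and user look suitable for this feature")
else:
    print("readings wander too much: fix the feel or grab a better tool")
```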
 
. . . Ratchets, friction thimbles, and pressure indicators largely solve that problem across multiple users - the "feel" is built into the instrument itself.

Although the ratchet and similar mechanisms make some progress toward solving inconsistency between users, it is certainly not universal. With many measuring tools, alignment of the tool to the surface being measured isn't built in: think of the narrow faces on a dial caliper, or a snap gage with a mounted dial indicator. That alignment is just as important as proper pressure when measuring; if the tool isn't aligned correctly, the measurement will still be off even with perfect pressure on the measuring faces.
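
One way to see why alignment matters as much as pressure is cosine error: a tool tipped off square reads long by a factor of 1/cos(angle). A toy Python example with invented numbers:

```python
import math

TRUE_SIZE = 1.0000   # inches, invented
TILT_DEG = 2.0       # tool tipped 2 degrees off square -- also invented

# A tilted tool measures along a longer path than the true dimension.
reading = TRUE_SIZE / math.cos(math.radians(TILT_DEG))
print(f"reads {reading:.4f} in, error {reading - TRUE_SIZE:+.4f} in")
# roughly +.0006" from alignment alone, before any pressure effects
```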

I feel sorry for those who always automatically think they're better than everyone. I always wonder if I'm as good. Tends to drive continual improvement and a desire to learn more.
 
Technical issues aside, why would you want to "report your employer"? Seems kind of counterproductive to me, but... what's the real goal here?
 
I have posted this here before but will ask.
Is "feel" on a gauge block being all nice and flat different than "feel" on a turned or ground shaft?
Is this worse with calipers as the nice block sort of squares up the jaws when checked?
 
Ages ago, at a place I worked, the guy running two CNC chucker lathes had to cut up his own stock on an automatic saw.
To check the slugs he kept a pair of Chinese digital calipers sitting by the saw; depending on how hard you squeezed them, the total variance was .010".
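
That "total variance" is just the range of repeat readings; a trivial sketch (numbers invented to match the .010" spread):

```python
# Repeat measurements of the same slug with varying squeeze -- invented
# numbers chosen to show a .010" spread.
slug = [2.005, 2.010, 2.000, 2.008, 2.003]
print(f"total variance {max(slug) - min(slug):.3f} in")   # 0.010
```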
 
Is "feel" on a gauge block, being all nice and flat, different from "feel" on a turned or ground shaft?
Is this worse with calipers as the nice block sort of squares up the jaws when checked? . . .

I'd say there's a difference, and there can be a lack of parallelism in the jaws as lapped. But you can mitigate this a bit if you measure just the "corner" of the gage block, and at a few different heights along the jaws. Or use a couple of "X"-class plug gages.

Almost always worse with the inside jaws, so I check just the tips and then down a bit in the setting ring.

Last, be sure the gib screws on the slide are adjusted correctly, that's a classic source of error.
 