
CMM or Mic Accuracy Question

Micmac1

Cast Iron
Joined Jun 9, 2017
Say I have received parts, and I am measuring an OD with a calibrated Mitutoyo micrometer. I find the parts to be .0002-.0003 undersize. My supplier used a CMM to inspect these parts; they were measuring .0003-.0008 over my measurement. We do not have a CMM to compare. I have checked with 2 different calibrated mics, 1 brand new, same readings. Are all my mics junk, and is the CMM really that much more accurate? I can't imagine these Mitutoyo mics being that far off. Yes, I rechecked the micrometers against gage blocks and a setting pin and they measure dead nuts.
 
OD is 2.125 dia. The mics measure the parts round within .00015, and an indicator with vee blocks gives the same result (the part was ground on centers). I checked all over, but I'm unsure of where they checked it with the CMM.
 
If the parts are turned and the roundness is good (not thin-wall / egg-shaped), it sounds like the CMM they were using needs its probe qualified.
 
I'd tend to trust your mics as the correct dimension. Just because the CMM can measure to 0.00015 doesn't mean the operator is getting the right dimension. Qualifying, as mentioned, could be an issue, or operator error on setup. I have seen anomalies because the operator was doing something stupid.

Also assuming that whoever is measuring on your end isn't cranking the mics down like they would a C-clamp. Incorrect ratchet or friction thimble use could account for the difference.

Even if the CMM is off, there still needs to be some agreement or resolution between you and the supplier on what is the correct procedure and dimension or what is acceptable.
 
What was the temp that they checked the parts at, and what is the temp you are checking them at? Another possibility is that they brought the parts in and inspected them hot off the machine with no soak time.
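For scale, here's a quick back-of-the-envelope sketch of what an unsoaked part can read on a 2.125" OD. The expansion coefficient is an assumed typical value for steel; cast iron and alloy steels vary a bit either way.

```python
# Thermal-growth estimate for a 2.125" steel OD.
ALPHA = 6.5e-6   # in/in/degF, assumed typical value for steel
DIA = 2.125      # in

def growth(delta_t_f):
    """Diameter change for a uniform temperature change in degF."""
    return ALPHA * DIA * delta_t_f

# A part inspected 10 degF warm (hot off the machine, no soak):
print(f"10 degF warm: +{growth(10):.5f} in")   # about +0.00014 in
```

So a modest soak difference accounts for a tenth or two on this part, but not the whole spread by itself.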
 
CMM model? How programmed?
Digital mic or vernier?
Surface finish? Mics take the tops off things and CMMs do not. CMMs have a lot of ways of calculating a diameter, and those give different numbers.
This is where you take a part back and forth until there is some basic agreement on size.
Bob
 

Do not know the model of the CMM or how it was programmed. Finish is approx Ra 16. Measured with a digital micrometer, Mitutoyo 293-332-30CAL to be exact (used 2 different mics, same model), zeroed on recently calibrated gage blocks.
 
If it's a manual CMM I would not trust the accuracy to better than .001". The operator has to be skilled with how fast he/she brings the probe into the part. Send them the gauge pin you used to check your mics. See what they get with the same operator. Better yet, stand there and watch him or her to make sure they measure the pin as carefully as the part.
 
The parts could be out of round (trigon) and still measure the same "diameter" at any angle. The CMM can then be set to measure the inscribed circle, the circumscribed circle, or one of several different functions based on several touch points.
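To see how that plays out, here's an illustrative sketch (not the OP's actual part; the mean radius and lobe height are assumed numbers). A 3-lobed profile has a constant two-point width, so a micrometer reads the same "diameter" at every angle, while the various circle fits a CMM can report land elsewhere:

```python
import math

# 3-lobed profile: r(theta) = R + e*cos(3*theta)
R, e = 1.0625, 0.0004   # assumed mean radius and lobe height, inches

def r(t):
    return R + e * math.cos(3 * t)

angles = [i * math.pi / 180 for i in range(0, 360, 5)]

# Two-point (micrometer-style) width: r(t) + r(t + pi).
# For any odd lobe count the cosine terms cancel, so the width is 2R everywhere.
widths = [r(t) + r(t + math.pi) for t in angles]
print(f"mic width min/max: {min(widths):.6f} / {max(widths):.6f}")  # both 2.125000

# Circle "diameters" a CMM might report from the very same profile:
radii = [r(t) for t in angles]
print(f"mean-radius fit: {2 * sum(radii) / len(radii):.6f}")  # 2.125000
print(f"circumscribed:   {2 * max(radii):.6f}")  # 2.125800, .0008 over the mic
print(f"inscribed:       {2 * min(radii):.6f}")  # 2.124200
```

With an assumed lobe height of only .0004, the circumscribed fit reads .0008 over the mic while the mic sees a perfectly constant size, which is exactly the kind of disagreement in the original post.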
 
This mic will exceed most normal CMMs for a measurement like this.
But: lobing. I'll assume a lathe and not a centerless grinder, as centerless users know this measurement problem.
No matter who is right or wrong, or what is considered a diameter... work with the customer and not against them.
How do I make my acceptance criteria match yours? They are the ones writing the checks.
I am a gauging nut to the extreme but have learned that this fight is rarely good.
Get to the hows and whys of where you and they have a divide on this. Perhaps both sides have a lesson.
If you're into such things, this is sort of a sweet whats-and-whys, so I do much like the post and its problems.
Bob
 
Having dealt with a similar issue and having had my own $$$ CMM in a temp controlled workshop that could measure this, I can tell you that any or all of the above suggestions could be the case.

The CMM, if automatic and programmed properly (generally, if using the OEM software it will adjust its speeds accordingly), and if used in a temp-controlled room (well controlled, within a few degrees of variation), AND if the part is heat-soaked sufficiently, AND if the CMM has also been heat-soaked (i.e., they didn't turn the A/C off when they left the night before and then run it right when they got in that day), then it's most likely:

1) Calibration error on the probe
1a) The tip may have flat spot(s) (these just occur over time and with rough operators)
1b) The probe may not be a scanning probe, and it's measuring a tri-lobe due to inherent variation in off-axis measurement internal to the probe
1c) The probe may be a scanning probe that hasn't been calibrated or has an inherent variation issue; combined with scanning at too high a linear speed for the rotation speed of the probe, it may be tracing a sinusoidal-ish pattern from the probe error that inflates the effective diameter measurement

2) The operator might be measuring something like the above-mentioned inscribed circle, while if you are using ASME Y14.5-2009 (or another Y14.5 revision), roundness defines a complete tolerance zone and no point may fall outside the high or low limit. A single fitted circle rides the extremes on one side only, so it won't capture a point failure.

3) Somehow you have always measured in low spots with the mic and your average is trending lower than the true average

4) Your part may be at a lower temperature when you measure it than when the CMM measured it. BTW, I assume you specified a temperature for the part dimension to be measured at. Your functional-requirement temperature is one thing, but the part dimension for inspection must be taken at the Q/A facility's room temperature and pressure. If your facility is colder, you will measure smaller.

We had 2 µm repeatable accuracy on our CMM, and the Renishaw scanning probes consistently measured 1.5 µm of lobing on sub-micron-round parts due to the probe design. We repeatedly had the CMM company calibrate the CMM, but they don't calibrate by scanning, just by touching 3 points to get a circle. The probe would rotate the tip in each spot for an averaged location, which eliminated the tri-lobe error perfectly. The issue was that when you scan, the CMM doesn't stop and perform at least a 120° rotation at each point; it maintains continuous travel to avoid introducing vibration, rolling the tip along the surface to reduce tip wear. The downside of saving a few bucks on replacing tips every few weeks and getting higher CMM throughput was that you would get a weird sinusoidal measurement from rolling the tri-lobe error onto the surface. On runout measurements you got these interesting soft peaks-and-valleys readings at the micron level that looked like surface roughness, when actually you were measuring the Renishaw design flaw.

This was years ago, I hope they've since solved this, but the probes are $$$ so I imagine a great many of these are out there.


Good luck, dealing with this level of accuracy in a Q/A dispute often leaves no winners.
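The scanning artifact described in the post above can be sketched like this; the amplitude and lobe count are assumed for illustration. Superimpose a near-sinusoidal probe error on a perfectly round part, and the "roundness" the CMM reports is entirely probe error:

```python
import math

# Perfectly round part scanned with a probe whose tri-lobe error rolls
# along the surface as a roughly sinusoidal term (assumed numbers).
R = 26.9875        # mm, radius of a 2.125" part
amp = 0.00075      # mm, assumed radial error amplitude (~1.5 um peak-to-valley)
lobes = 3

samples = [R + amp * math.sin(lobes * 2 * math.pi * i / 720) for i in range(720)]

# Apparent roundness (peak-to-valley) the CMM reports, in microns:
apparent_um = (max(samples) - min(samples)) * 1000
print(f"apparent roundness: {apparent_um:.2f} um")  # ~1.5 um, none of it real
```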
 
Sort of what has been said. To add: is the CMM traceable to NIST? Is your mic? Most CMMs are only capable of, say, 1 µm, so about .00004". I do have a P&W Supermic that is more accurate than all but a few CMMs. Also, there can be bias between different measuring instruments, or even between CMMs.
I did work on an issue between 3 different CMMs: 10 µm on a perpendicularity, basically 3 µm between each one. Where the points were taken also mattered.
As some have alluded to, filtering is an issue. It is more than just inscribed or circumscribed.

So: accuracy of the CMM. Probe calibration. Alignment as a true circle vs. skewed to an oval. Temperature. Filtering of the measurement.
As to the micrometer: I am not sure what mic you are using. I know you listed it. If it's a bench mic, that's OK. Operator feel for the mic matters. Still, is it traceable to NIST? You are taking measurements from only 2 points.

I did look at the mic listed. We use them for reference. I usually deal in microns; so, 2.5 microns per tenth. Some of the parts I deal with have a 1 micron total tolerance. The CMMs I have are not rated for that: we rate one at 5 microns and the other at 10 microns. I did a personal study of the 5 micron CMM vs. an MFU 100 that does diameters with a tolerance of +/- 250 nanometers. I was consistently 0.5 microns low (might have been high). So, a bias.
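Since this thread keeps hopping between microns and tenths, here's a quick sanity check on the conversions (exact, since the inch is defined as 25.4 mm):

```python
# Micron <-> inch conversions used in this thread.
UM_PER_IN = 25400.0   # exact: 1 in = 25.4 mm = 25400 um

print(f"1 um = {1 / UM_PER_IN:.6f} in")                     # 0.000039 in
print(f"1 tenth (.0001 in) = {0.0001 * UM_PER_IN:.2f} um")  # 2.54 um
```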
 
Are the mics calibrated/zeroed off of gauge blocks or a round master?
There can and almost always will be a difference on a hand held mic.
Have you ever done the basic standard blind repeatability tests on said mics?
I give this mic +/- .0003 at sixes if used well, maybe .0002. I'd give 60% of CMMs in use worse than that.
Now the world gets confusing.
Temp: how much temp change does it take to see this .0007 size change on this 2.xx dia part?
The OP is not working in the micron world and just wants to make good parts the customer will accept, all while under the gun of "my numbers do not match yours" and everything that entails.
How to get out of the problem?
It's all nice to have big-dollar CMMs and clean rooms and to bow down to those that do, but that's not so much in the OP's budget.
Bob
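Bob's temperature question above yields to a quick estimate. Assuming a typical steel expansion coefficient of ~6.5e-6 in/in/°F (an assumed value; the actual alloy will vary a little):

```python
# How much temperature change moves a 2.125" steel OD by .0007"?
ALPHA = 6.5e-6   # in/in/degF, assumed typical value for steel
DIA = 2.125      # in

delta_t = 0.0007 / (ALPHA * DIA)
print(f"about {delta_t:.0f} degF")   # roughly 50 degF
```

A 50°F part-to-part temperature difference is unlikely between two soaked inspection rooms, so temperature alone probably doesn't explain the whole .0007 disagreement.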
 
How long are you holding on to the mic? Do you have insulating gloves on, or is this an indicating micrometer on a stand?
I messed with a coworker as he was measuring carbide punches to .0001: when his back was turned I had my hand on the mic for about 15 seconds. He always laid the mic on the granite table to keep the temp the same.
This guy was really good and precise, and it confused him for a bit till I let on. The measurement was known but was being rechecked.

Dave