
Rule of 10:1 Inspection Ratio Questions

Piper3T
Hello,

I am evaluating the equipment in our inspection department and trying to do a simple 10:1 ratio sanity check on some of the equipment we use for tight-tolerance parts (tolerances of .001 or less). We do a lot of this for the size and position of features.

I've read a lot of literature about the ratio and I'm confused about what number to use for the ratio: resolution of the device? Bi-directional accuracy? Total accuracy range (double the bidirectional value)?

For example, in the Mitutoyo catalog there is a simple OD mic:
Resolution: .0001 (vernier scale)
Bi-directional accuracy: +/-.0002
Total accuracy range: .0004

None of the literature I've read is explicit about which value to use; the terms resolution, accuracy, precision, etc. get used interchangeably.
I am leaning towards the resolution of the device, but then I'm not sure that accounts for the accuracy, which is always larger than the resolution.

Any clarification would be appreciated.

Thank you.
 
*For inspecting parts, a dial-indicating micrometer and a set of Jo blocks take much of the guesswork out of checking.

[For example, in the Mitutoyo catalog there is a simple OD mic:
Resolution: .0001 (vernier scale)
Bi-directional accuracy: +/-.0002
Total accuracy range: .0004]

*Looks reasonable: marked in tenths, normal accuracy +/-.0002, which equals a range of .0004.
Touch and feel on a mechanical micrometer can vary .0001 or so.

That's a loose measure for inspecting parts toleranced under half a thousandth, IMHO.
 
This is my take:

10:1 applies to however you're measuring; it's unitless but method-specific.

If your part tolerance is +/-.001, then your equipment should be +/-.0001.
If your part tolerance is +0/-.001, then your equipment should be +0/-.0001 (which is to say +/-.00005).

10:1 gives you good process control. This is what you want if you wish to be all scientific about Cpk, minimizing scrap in production, making real inspection test reports, etc.


Using your example, the simple OD mic from the Mitutoyo catalog:
Resolution: .0001 (vernier scale)
Bi-directional accuracy: +/-.0002
Total accuracy range: .0004
I would use this to measure part tolerances of +/-.002 at 10:1.

3:1 is fair for inspecting most parts in my opinion, for my uses, as the sole inspector and operator of said equipment.
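As a quick sketch of that arithmetic (Python; the tolerance and ratio values are just the examples from this post, and the total-band-over-ratio reading of the rule is one interpretation, not a standard):

```python
# Sketch of the ratio arithmetic described above (example values only).

def required_gauge_accuracy(tol_plus, tol_minus, ratio=10.0):
    """Allowable +/- gauge error for a given tolerance band and chosen ratio."""
    band = tol_plus + tol_minus      # total tolerance band, e.g. +/-.001 -> .002
    return band / ratio / 2.0        # expressed as a +/- value

# +/-.001 part at 10:1 -> gauge should be good to +/-.0001
print(required_gauge_accuracy(0.001, 0.001, 10))    # 0.0001

# +0/-.001 part at 10:1 -> band is only .001 wide, so +/-.00005
print(required_gauge_accuracy(0.000, 0.001, 10))    # 5e-05

# Working backwards: a mic with a .0004 total accuracy range, used at 10:1,
# suits a total band of .004, i.e. roughly +/-.002 parts
print(0.0004 * 10)                                   # 0.004
```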
 
Thank you for the responses.

The reason I am looking into this is that some people in the company use very large measurement tools (up to 60 inches) and expect the same accuracy as from the smallest tools. They are not in the inspection department, but they make the decisions.

For example, we have many ODs that are 36 inches and up. The tolerance is +/-.0005, and all we have for measurement is a large vernier caliper.
The same can be said for many other types of features and positions.

From your response dsergison, I still run into the same question.
"if your part tol is +/- .001 then your equipment should be +/-.0001"

I am confused about the "your equipment should be." I'm assuming you mean bi-directional accuracy?
Some other literature I've read seems to state that it's all dependent on the resolution of the device's scale and that accuracy does not play into it:
"The ten-to-one rule recommends that the measuring instrument resolve to approximately ¹⁄₁₀ of the tolerance being measured."

Since resolution is always smaller than accuracy, it could mean I select an insufficient tool for the job.

This is my confusion, sorry if I am not understanding correctly.
 
I'm sorry but it's 10:1 off the worst case error.
You can live with 8:1, and, if you are willing to give up a fair bit of your tolerance, 6:1.

Final part prints are not manufacturing or process prints. Floor prints must give up room for the floor measuring equipment being used.
3:1 is almost unworkable. Here, on a +/-.001 part, a +.0004 reading is maybe scrap (or not, who knows), and any size adjustment in a machine is a guess.
One can reduce the error band by averaging multiple readings, but it's hard to remove the human part here, and this only works with computers or automated gauges.
Life is so much easier if you actually have 10:1 systems and can know that +/-.001 parts reading +/-.0009 on the gauge are good.
It's hard to have too much repeatable resolution, but it gets expensive fast. Most will go with "feels good and right".

The mic you mentioned "might" be usable at +/-.002 if handled right and used by well-trained people.
This 10:1 crap, gauge R&R, etc. are a miserable thing once you get into them.
For some it is about holding and adjusting a process, for others about making sure you ship no bad parts, and for others still it's about being good enough to work for the customer.

Forty years back, 10:1 on plain gauge resolution alone would have been fine. Not now.
Bob
 
Piper3T --

Let me refer you to a discussion on this board a few years back: http://www.practicalmachinist.com/vb/metrology/vernier-dial-digital-calipers-250384/

One posting in the discussion says Starrett claimed an uncertainty for their vernier calipers of 0 +/- 0.0005 inch per foot, which is, of course, 0 +/- 0.0015 inch per yard, and I think we'd all agree that Starrett vernier calipers are generally very well regarded.

Another posting (mine) cites the obsolete U.S. Federal Specification, which requires that main-scale graduation errors not exceed 0.0003 inch per foot and that vernier-scale graduation errors be no greater than 0.0002 inch. In the worst case, these errors would sum arithmetically to 0.0011 inch over 36 inches.

John
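Spelling out those worst-case sums (same figures as above):

```python
# Worst-case sum of the graduation errors quoted above, over a 36" measurement.
length_ft = 36 / 12                      # 3 feet

# obsolete U.S. Federal Specification figures
main_scale_err = 0.0003 * length_ft      # 0.0003 in per foot of main scale
vernier_err = 0.0002                     # flat allowance on the vernier scale
print(f"{main_scale_err + vernier_err:.4f} in")   # 0.0011 in

# Starrett's quoted +/-0.0005 in per foot
print(f"{0.0005 * length_ft:.4f} in")             # 0.0015 in
```

Either figure is already larger than the +/-.0005 print tolerance on those large ODs, before the 10:1 question even comes up.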
 
"I'm confused about what number to use for the ratio"
It's the ratio of the accuracy spec of the measuring instrument to the spec you want to certify.

For example, if you want to certify to +/- 0.001", your standard must be +/- 0.0001" or better.

Simple as that.

There's a good reason for that ratio.

Measuring instruments have numerous small sources of error, and they're cumulative.

By using a standard that meets the 10x ratio, you can ignore those small errors.

If you can't meet the 10x ratio then you need to enumerate and evaluate each error term.

This is standard practice in the science of metrology, not just for mechanical measurements.

- Leigh
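As a minimal sketch of what "enumerate and evaluate each error term" can look like (the error terms and the root-sum-square combination below are illustrative assumptions, not anyone's spec):

```python
import math

# Hypothetical error terms for one measurement setup, in inches (made-up values).
error_terms = {
    "instrument accuracy": 0.0002,
    "operator feel / repeatability": 0.0001,
    "part vs. gauge temperature": 0.00015,
    "fixturing / alignment": 0.0001,
}

worst_case = sum(error_terms.values())                    # straight arithmetic sum
rss = math.sqrt(sum(e**2 for e in error_terms.values()))  # root-sum-square estimate

print(f"worst case: +/-{worst_case:.5f}")   # +/-0.00055
print(f"RSS:        +/-{rss:.5f}")          # about +/-0.00029
```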
 
I am confused about the "your equipment should be." I'm assuming you mean bi-directional accuracy?
Think of an automobile speedometer.
It's a unidirectional indicator. If you back up, it reads zero.

But if you back up at 60mph in a 30mph zone you'll get a ticket.
The cop doesn't care which direction you're going.

The same is true of a measuring instrument.
If you're only concerned with unidirectional error for a given part, that's the part of the spec you use.

If your part is +0/-.001" and your standard is +/- 0.0001", you're just barely there.
The fact that the part tolerance is unidirectional does not change your instrument to +0/-0.0002".

The terms "bi-directional accuracy" and "uni-directional accuracy" are invented by us to describe an application.
Those terms do not exist in the definition of a measuring instrument.

- Leigh
 
"Some other literature I've read seems to state that it's all dependent on the resolution of the device's scale and the accuracy does not play into it. ... Since resolution is always smaller than accuracy, it could mean I select an insufficient tool for the job."
Not just 10x resolution but 10x certified accuracy. This drives you very quickly to very expensive tools.
 
The 10:1 ratio isn't a necessary part of metrology; it's a convenient rule of thumb that allows you to ignore the errors of the standard used for the measurement.

When you are working to the limits of your capabilities you apply corrections for the measured errors of your measuring equipment as a part of the job. Ever wonder why gauge blocks come back from a calibration with a list of measured errors and a calibration uncertainty?
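A minimal sketch of what applying those corrections looks like (the block sizes, deviations, and uncertainties are hypothetical, not from any real certificate):

```python
# Using the calibrated deviations of a gauge block stack instead of assuming nominal.
# All figures are hypothetical stand-ins for a real calibration certificate.
stack = [
    # (nominal size, deviation from cert, cert uncertainty), inches
    (1.0000, +0.000002, 0.000003),
    (0.5000, -0.000001, 0.000002),
    (0.1007, +0.000001, 0.000002),
]

nominal = sum(size for size, dev, unc in stack)
corrected = sum(size + dev for size, dev, unc in stack)   # what the stack really is
uncertainty = sum(unc for size, dev, unc in stack)        # worst-case cert uncertainty

print(f"nominal {nominal:.6f}, corrected {corrected:.6f} +/- {uncertainty:.6f}")
```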
 
Maybe someone can answer this for me. If 10:1 is the rule of thumb (like Mark said), and your tolerance is +/-.005, you only need a tool capable of reading to +/-.0005, correct? So inspecting that part/tolerance with a tenths indicator is a waste of time, yes? The reason I ask is that we had some parts with +/-.005 that QC inspected, showing some of them were .0001-.0003-ish under the low limit. Wouldn't it be more practical to use a half-thou tool and round up/down, or does that 'violate' quality procedures and such?
 
"Wouldn't it be more practical to use a half-thou tool and round up/down, or does that 'violate' quality procedures and such?"

The limit is the limit, but a +/-.005 part that is .0001, perhaps .0005, past it is most likely still a go part. One might have to make a call to the customer. But one would not just say "we think we are about .0005 off," because then the customer does not know where you are...

For practical purposes, a +/-.005 tolerance checked with a .001 micrometer would show .0005 easily enough to a skilled tool guy making the part.

Like a barn door with 1/16" being .006 in error.
 
"Limit is the limit.. but a +/-.005 with a .0001/.0002 error is most likely still a go part. One might have to make a call to a customer."

I agree (in theory) that out of tolerance is out, but still... In my mind, had he used a more 'practical' method, he would have said something like "Well, we have a lot that are on the low limit of the tolerance; we need to get it fixed next time," instead of "These parts are bad, I'm holding them for approval."
By the way, it was not a machined feature; for that matter, it could have been dimensioned +.01/-.02" and been perfectly fine. I know that because I machine all the parts for the assembly and know what said feature is used for.

The consensus (upper management) was he should not have 'flagged' the parts at all, should have just addressed the issue for next run.
 
Upper management likely knows what the customer will accept but does not want the floor to know or they will get too loose...

The 36" part might go at +- .015 and the floor is expected to try for close.
 
"Wouldn't it be more practical to use a half-thou tool and round up/down, or does that 'violate' quality procedures and such?"

If your gauge uncertainty goes to 5/10ths then your max good accept number now goes to +/-.0045 on a +/-.005 print.
Part still fails as it should.
Bob
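That guard-band arithmetic, spelled out (same numbers as Bob's example; a sketch, not a quality-system procedure):

```python
# Guard banding: pull the accept limits in by the gauge uncertainty so that a
# reading inside the guarded limits guarantees a conforming part.
def accept_limits(nominal, tol, gauge_uncertainty):
    lower = nominal - tol + gauge_uncertainty
    upper = nominal + tol - gauge_uncertainty
    return lower, upper

# +/-.005 print checked with a gauge trusted to half a thou:
print(accept_limits(1.000, 0.005, 0.0005))   # (0.9955, 1.0045) -> effectively +/-.0045
```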
 
1.000 +/- .005 means your parts must fall between 1.00500000... and 0.99500000...

It's just a real kick in the balls when you know they wouldn't have -caught- that one part that measured 1.005100 if they had used a less-precise instrument that yielded only 1.005. Your inspectors are right. It just sucks.


The 10:1 rule of thumb is nice. But like everything, you have to know your tools. Just because your caliper reads to tenths and tells you 1.0045 on a part, you have to know what the part /could actually be/ while the caliper still reads that number. That will tell you whether you are -certain- it is a good part or whether it only -might- be a good part, in which case you may need to check it with something more precise to ensure you don't exceed the tolerance limits.

Leigh and Bob sound like they have a much better handle on this topic than I do - I tend to go cross-eyed when it gets too deep in the gage science. Just not my cup of tea, I suppose.
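A sketch of that "certain vs. might be" call (the +/-.001 caliper and +/-.0002 mic uncertainty figures are assumptions for the example):

```python
# Classify a reading against the print limits, given the instrument's uncertainty.
def judge(reading, low, high, uncertainty):
    if reading - uncertainty >= low and reading + uncertainty <= high:
        return "certainly in tolerance"
    if reading + uncertainty < low or reading - uncertainty > high:
        return "certainly out of tolerance"
    return "indeterminate - recheck with something better"

# Caliper reading 1.0045 on a 1.000 +/-.005 part, caliper trusted to +/-.001:
print(judge(1.0045, 0.995, 1.005, 0.001))    # indeterminate
# Same reading from a mic trusted to +/-.0002:
print(judge(1.0045, 0.995, 1.005, 0.0002))   # certainly in tolerance
```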
 
"1.000 +/- .005 means your parts must fall between 1.00500000... and 0.99500000..."
The limits are INclusive.

So if your measurement is 1.005000000... including your measurement uncertainties, the part passes.

"Leigh and Bob sound like they have a much better handle on this topic than I do"
I worked full-time as a calibration tech in a commercial lab for several years while going to night school.

We certified stuff for customers who were really picky, like NASA, DOD, and DARPA.

- Leigh
 
I'm not a metrologist and I could be completely wrong. But, if I understand correctly we're talking about making sure on the shop floor that parts having diameters of 36" or more would be within +/- 0.0005" from the nominal dimension (I assume at a specified temperature).

Personally, I'd be much more concerned with thermal expansion than with the precision and resolution of the measuring tool: assuming that both the material being machined and the measuring instrument have exactly the same coefficient of expansion, a temperature difference of just 2°C (3.6°F) between the two would throw the measurement out of the required tolerance.

Considering the magnitude of the error introduced by relatively minor temperature changes for a shop, I think the part can be correctly measured only in a temperature-controlled lab, after any heat generated by machining has completely dissipated into the surrounding environment. Any measurement taken on the machine is very unlikely to be accurate enough. Again, due to the magnitude of the measuring errors associated with thermal expansion, I would be skeptical that we could still use the 10:1 rule. I think the thermal expansion error must be properly calculated and factored in for both the part and the instrument.

Paolo
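Worked out, using an approximate expansion coefficient for plain steel (an assumption; the actual alloy may differ):

```python
# Length error from a temperature difference between a 36" part and the tool.
alpha_steel = 11.5e-6    # per deg C, approximate for carbon steel (assumption)
length = 36.0            # inches
delta_t = 2.0            # deg C difference between part and instrument

error = length * alpha_steel * delta_t
print(f"{error:.5f} in")   # about 0.00083 in, already past a +/-.0005 tolerance
```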
 