
Thoughts about machine certification.

PROBE

During my 50-year professional career I have performed and supervised over 1,000 calibration and certification activities on CNC machines. In the vast majority of cases these activities were carried out to prove that a machine's accuracy conforms to a given standard. Machine producers refer geometric accuracy (straightness, squareness, flatness) to the ISO standards ISO 230-1 and ISO 10791-2. These standards determine the expected result of the test (ISO 10791-2) and how, and with which tools, the test should be performed (ISO 230-1).
Almost all machine producers known to me, and especially the BIG O-M SOCIETY, emphasize that the accuracy of their machines far exceeds the ISO 10791-2 requirements.

Recently I had the opportunity to witness the certification of one of the BIG O-M machines, which I was supposed to approve. The activity was performed by the engineering staff of the producer's official representative. The "Certificate of accuracy" furnished by the producer states that all tests should be performed in accordance with ISO 230-1 directions. The allowed straightness error, although specified by ISO 10791-2 as 15 microns over 800 millimeters, was narrowed by the machine producer to 7 microns over 800 millimeters. The test was performed using a class 00, 600 mm long granite straight edge and a Mitutoyo model 2110S-10 dial indicator.

According to standard DIN 876, a class 00 granite straight edge of 600 mm length assures a straightness accuracy of 2.6 microns.
The data below is taken from the Mitutoyo specification chart for the 2110S-10 indicator:
GRADUATION: 0.001 mm
RANGE: 1 mm
ACCURACY, OVERALL: ±5 microns
ACCURACY, RETRACE: 3 microns
ACCURACY, 1/10 REVOLUTION: 2.5 microns
ACCURACY, 1 REVOLUTION: ±4 microns
REPEATABILITY: 0.5 micron
The accuracy of measurement within a range of 1/10 revolution (disregarding retrace error) is 2.5 microns.

According to ISO 230-1, "The measuring instrument should not cause any error of measurement exceeding a given fraction of the tolerance to be verified." It is well known to any machinist that the accuracy of a measuring instrument should be at least 10 times better than the tolerance being measured. This is well expressed on the website WHAT WHEN HOW, section MEASURING INSTRUMENTS (METROLOGY): "If a measurement is desired to an accuracy of 0.01 mm, then an instrument with an accuracy of 0.001 mm should be used for this purpose."

Summarizing: in order to fulfill the machine producer's CERTIFICATE OF ACCURACY statement that tests should be performed in accordance with ISO requirements, the combined error of the Mitutoyo indicator and granite straight edge should not exceed 0.7 microns!
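For illustration, a minimal sketch (in Python) of the arithmetic behind that figure, assuming the common 1:10 reading of ISO 230-1's "given fraction" and a simple worst-case sum of the two instrument errors rather than a formal uncertainty budget:

# Sketch only: the 1:10 fraction and the straight summation are assumptions.
tolerance_um = 7.0              # producer's straightness tolerance over 800 mm
straightedge_um = 2.6           # DIN 876 class 00, 600 mm straight edge
indicator_um = 2.5              # Mitutoyo 2110S-10, accuracy over 1/10 revolution

allowed_instrument_error_um = tolerance_um / 10             # 0.7 micron under the 1:10 rule
worst_case_combined_um = straightedge_um + indicator_um     # 5.1 microns

print(allowed_instrument_error_um, worst_case_combined_um)  # 0.7 vs 5.1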

The discussion with the producer's representative certification team, once the above was presented to them, was interesting: "But all the world is doing certifications exactly this way, using this kind of equipment!"
Maybe. But here I was asked to approve, and I refused to do so. Standards are to be observed, and if one cannot observe them, one should not refer to them.
 
I think you are incorrect. "It is well known" and a website do not make for convincing arguments. Agreed that it is a rule of thumb that measuring instruments should exceed the measurements they will make by a factor of ten, but that doesn't make it true in all cases.

If the instrument is truly capable of measuring to the accuracy required, I see no reason that rule of thumb should be applied.

Can you come up with an actual standard that says "instrument accuracy must be ten times greater than the units they measure" ?

These days many people have gone away from another "rule of thumb": that you should not measure something on the same machine it was made on. This one is less defensible, imo. Any error in the machine is going to be repeated when using it to measure with.

But I guess part tolerances are now so much wider than machine measuring abilities that this rule no longer applies? Or if so, not so stringently?
 

My day job is in the electrical domain. We have the same 10-to-1 rule of thumb. When you can meet the 10-to-1 rule, you can ignore measurement error. When you can't, or when it's impractical (i.e., too expensive) to meet it, you need to tighten the pass/fail tolerance by the possible measurement error, so that the unit being tested meets the requirements if it passes the tests. (The semiconductor industry does this all the time and refers to the practice as guard-banding.)

In your example the possible measurement error from the granite straight edge is 2.6 microns and from the dial indicator 2.5 microns, a total of 5.1 microns. They reduced the pass/fail tolerance by 8 microns, providing 2.9 microns of measurement margin.
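A minimal sketch of that guard-banding arithmetic, using the figures from this thread; the worst-case summation and the sample reading are illustrative assumptions, not a formal uncertainty calculation:

# Guard-banding sketch: tighten the acceptance limit by the possible measurement
# error so that a passing result implies the true value meets the original spec.
spec_limit_um = 15.0              # ISO 10791-2 straightness limit over 800 mm
test_limit_um = 7.0               # producer's tightened pass/fail limit
measurement_error_um = 2.6 + 2.5  # straight edge + dial indicator, worst case = 5.1

guard_band_um = spec_limit_um - test_limit_um      # 8.0 microns of tightening
margin_um = guard_band_um - measurement_error_um   # 2.9 microns of margin

measured_um = 6.0                                  # hypothetical test reading
passes = measured_um <= test_limit_um              # True -> within spec even in the worst case
print(guard_band_um, margin_um, passes)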

Their test is valid and you should approve.

CarlBoyd
An engineer does for $1 what any fool can do for $2.
 
Uncertainty of measurement is a field all of its own. The 10:1 rule is absolutely just a rule of thumb, employed when you don't want or need to, don't know how to, or are otherwise unable to calculate the actual uncertainty.

I ignore the 10:1 rule about as frequently as I impose it.
 

I remember you mentioning this before. (In a good way).

For me personally, having a machine that is compliant with ISO 230-2 (in my case) is a "first pass".

For the ISO 230-1 type measurements it's good to see in general terms how square and straight a machine might be, especially when comparing the cheaper line of mills or lathes with the top-drawer line of mills and lathes from the same MTB.

Most MTBs, especially the Japanese and most good Taiwanese builders, have test sheets (obviously, as you know) that exceed the basic threshold requirements of ISO 230-1 or ISO 230-2, in some cases pretty significantly, so as to stay easily inside those failure thresholds. [I get your point about the apparent "metrological illiteracy" of some of the procedures, but some MTBs go to better instruments and different schemes of their own beyond JIS and DIN.]

Some of the test sheets' actual results can be nice, but I ultimately prefer ball bar tests in three planes, laser interferometry tests for straightness, ISO test parts whose data is not so obviously cherry-picked, and, for the rotary axes of 5-axis machines, rotational plots, unidirectional and bidirectional, with stated confidence limits (for rotary repeatability and accuracy), plus volumetric determinations like the NIS 5-axis slanted "cone test".

AND, importantly, runout of the spindle and repeatability of the tool receptacle / interface. [For example, MAZAK might say an Integrex on the Capto interface has 1 micron repeatability? What does that mean?]

I make clear distinctions between open loop control resolution versus least commandable increment versus where and how the machine actually moves.

Generally, most machine manufacturers are scared to share such data for fear of enabling actual comparisons with machines made by other manufacturers.

Generally I find the tested values to be fairly substantially inside the envelope of ISO 230-1 and 230-2 (under ideal conditions), in some rare cases right at the edge. The static single-value measurements presented in a brochure are IMO not very helpful; they are just a vague indication that a machine might not be complete rubbish. Sometimes thin slices out of a data set are presented as actual accuracy values (particularly rotational accuracies) that give the impression a given 5-axis machine is far more accurate than it really is, and sometimes a different manufacturer will be ultra-conservative about stated accuracies, play it ultra safe, and actually downplay how accurate a machine might be. [There is a big difference in number and magnitude between single small-angle unidirectional moves referenced to the preceding move at a one-sigma confidence limit, versus a multitude of large-angle bidirectional rotations stated to two-sigma positional certainty: minimally a factor of four or eight difference in how rotary accuracy is presented in a brochure, depending on whether the machine has rotary scales or not.]

Still, laser and ball bar plots, test parts, and spindle characteristics / bearing design and performance are more meaningful, together with the calibration capabilities of the machine and actual full rotational plots, and whether the MTB actually got its "math" right, i.e. used the best mathematical scheme for auto-calibration of a five-axis machine.


______________________________________________________________________

* Random observations and over-generalizations from my "travels" and travails.
 

As others have pointed out, you aren't following a standard, but rather a rule-of-thumb. The actual standards are, I think, the ISO guides to the expression of uncertainty in measurement and, maybe, the ISO 14253 series. This is, based on my limited understanding, non-trivial to calculate, hence the rules of thumb being used.

It's all well and good to be dogmatic, but if you're going to do so, you have to be right.
 
Thank you all for responding. Let's not forget that it was the machine producer who stated that the test procedure should conform to ISO. By the way, the tests in question include both geometry (ISO 230-1) and positioning (ISO 230-2). I decided to raise only geometry here (everyone knows what a straightedge and an indicator are); dealing with the meaning of the statements in ISO 230-2 is much more complicated.
The ISO 230-1 statements are clear:
1. "The measuring instrument should not cause any error of measurement exceeding a given FRACTION of the tolerance to be verified" (ISO 230-1, par. 6).
2. "Known errors of the straightedge should be taken into account in processing the measurement data" (ISO 230-1, par. 8.2.1.1).

I would refrain from arguing whether 9.999 is in fact a fraction of 10, or whether the 1:10 rule is a rule of thumb. It is clear that a combined 5.1 micron error while dealing with a 7 micron tolerance is absolutely not allowed. Choosing proper tools when performing calibration/certification tasks would be technologically much more useful and correct. Among many others, a Mitutoyo LGH 542-715 gage head (accuracy 0.2 micron) and an error-mapped straightedge would fulfill the above-mentioned ISO demands and assure the reliability of the tests. Metrology is simple if the rules are DOGMATICALLY observed.
 

Except the accuracy isn't 5.1 µm. You actually have to calculate the measurement uncertainty, not just add the numbers together.

Take a look at JCGM 106:2012, Evaluation of measurement data – The role of measurement uncertainty in conformity assessment. This standard covers exactly what you are trying to do.
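As a non-authoritative illustration of the difference between straight addition and a GUM-style combination, here is a sketch that treats each stated instrument limit as the half-width of a rectangular distribution and combines the standard uncertainties in quadrature; the distribution choice and the coverage factor k = 2 are assumptions, not values taken from the standard:

import math

# Hedged sketch: manufacturer limit 'a' treated as the half-width of a rectangular
# distribution, giving a standard uncertainty of a / sqrt(3).
def u_rect(half_width_um):
    return half_width_um / math.sqrt(3)

u_straightedge = u_rect(2.6)   # granite straight edge limit
u_indicator = u_rect(2.5)      # dial indicator limit

u_combined = math.hypot(u_straightedge, u_indicator)  # root-sum-of-squares
U_expanded = 2 * u_combined                            # coverage factor k = 2 (approx. 95 %)

print(round(u_combined, 2), round(U_expanded, 2))      # ~2.08 and ~4.16 microns
# Noticeably smaller than the 5.1 micron straight sum, though still large next to 7.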
 
1. "The measuring instrument should not cause any error of measurement exceeding a given FRACTION of the tolerance to be verified" (ISO 230-1, par. 6)

I've never read the standard, so I could be completely wrong here, but stating "given fraction" wouldn't mean anything to me if they don't give you the fraction. Otherwise anyone can make up whatever tolerance they want. 1/100 would be just as valid as 100/1. Does the standard specify a fraction elsewhere in it?
 

Mr. George Schuetz of Mahr Federal (hopefully a convincing reference) addresses the ten-to-one rule of thumb here: The Ruler of Thumb, Part 1 - Mahr Metrology.
"For example, in gages using analog or digital readouts, the rule says the measuring instrument should resolve to approximately 1/10 of the tolerance being measured. This means that if the total tolerance spread is 0.0002 in. (i.e., ±0.0001 in.), the smallest increment displayed on the gage should be 20 µin. A gage that only reads to 50 µin. can't resolve closely enough for accurate judgments in borderline cases, and doesn't allow for the observation of trends within the tolerance band. On the other hand, a digital gage that resolves to 5 µin. might give the user the impression of excessive part variation as lots of digits go flying by when using the display. ON THE OTHER HAND, 10:1 IS NOT READILY ACHIEVABLE ON SOME EXTREMELY TIGHT TOLERANCE APPLICATIONS - SAY ±50 MICROINCH OR LESS - AND IT MAY BE NECESSARY TO ACCEPT 5:1. BUT FOR COARSE WORK, 10:1 OR SOMETHING VERY CLOSE TO IT IS ALWAYS A GOOD RECOMMENDATION." Just to emphasize: 50 microinches equal about 1.3 microns. The 7 micron tolerance in the discussed case is for sure COARSE WORK. Instruments with a measuring accuracy of 0.7 microns are available off the shelf, and professionals should use them when performing the task in question.
 
You have accused a manufacturer of not following a standard and, from the sound of it, rejected a machine because they didn't follow your preferred rule of thumb of 10:1 for Test Accuracy Ratio (TAR). You don't actually appear to be following a standard in coming up with your rejection, despite there being several standards out there. While I'm sure George knows his stuff, and I have found his articles very helpful, they aren't a standard. It's one thing entirely to say that you wish they were using higher-accuracy measuring equipment, but accusing them of not observing the standard while you are ignoring all of the standards that actually discuss how to handle this measurement seems a bit hypocritical.

On another note, what George wrote and what you did are two completely different things. You are demanding a TAR of 10:1. He is suggesting that the tolerance-to-resolution ratio be 10:1. Since most metrology equipment has much better resolution than accuracy, your demand is significantly stricter than what he's suggesting. In your example, the dial indicator has a resolution of 1 µm or better but an accuracy of 2.5 µm. Taking 1 µm as the resolution, George's 10:1 ratio would yield a TAR of 4:1, exactly what the old Z540 standard recommended prior to the switch to the GUM.
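To make that distinction concrete, a small worked sketch; the 10 µm tolerance is simply the hypothetical value implied by applying the 10:1 resolution rule to a 1 µm indicator, not a figure from any standard:

# Hypothetical illustration: resolution-to-tolerance ratio vs. Test Accuracy Ratio.
resolution_um = 1.0                  # indicator graduation
accuracy_um = 2.5                    # indicator accuracy over 1/10 revolution
tolerance_um = 10 * resolution_um    # tolerance allowed by the 10:1 resolution rule

tar = tolerance_um / accuracy_um     # TAR is based on accuracy, not resolution
print(tar)                           # 4.0 -> the 4:1 ratio mentioned above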
 
My take on this: for a machine to be accepted, the measurement combined with the possible test-equipment tolerance band (due to instrument accuracy, etc.) must be within the stated tolerance.
 
My interpretation of your problem here is that the ISO spec is open to interpretation!
Stating that the measuring equipment should not cause any error of measurement exceeding a given FRACTION of the tolerance, without specifying what that FRACTION is, makes for a badly written standard.

So, that aside, I would take this approach:
Put the FRACTION and the 10:1 rule of thumb both aside too.


What is actually required:
Accuracy of machine required = 7 microns over 800 mm.

600 mm straight edge tolerance = 2.6 microns. The length of the gauge is not ideal (it is short), but doing three tests by moving it (fully left of travel, in the middle, fully right of travel) would IMHO give a suitable solution, extrapolating the straight edge error from 600 to 800 mm.
BUT what is its actual calibrated value: 2.6 microns, 0.1 microns, or something in between?

The dial indicator: again, what is the actual calibrated value of this specific instrument (accuracy and repeatability)?
I bet it will be far better than stated in the Mitutoyo spec sheet.

Until you know these actual gauge errors, you're surely only dealing with worst-case figures here?
Actual figures would be better than the stated specification-sheet values.

First of all, it is not I who should know the actual gauge errors; I was just watching the activity. The machine producer's representative engineering team was the one who should know. But they didn't; they weren't even aware that a problem existed. For my part, I could only refer to the instruments' specifications and assume worst-case figures. The solutions are simple and given exactly in ISO 230-1: use a proper indicator (for example, the Mitutoyo LGH 542-715 gage head mentioned above) and either an error-mapped straight edge or a slightly more advanced normal-and-reversed setup. Once more: metrology is simple if the rules are strictly observed.
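For completeness, a small sketch of the normal-and-reversed (reversal) technique mentioned above; the sign convention and the sample readings are illustrative assumptions, not measured data:

# Straightedge reversal sketch: two indicator traces along the same travel, with the
# straight edge flipped between runs, separate the guideway error from the straight
# edge's own error.
#   normal run:   m1(x) =  s(x) + g(x)
#   reversed run: m2(x) = -s(x) + g(x)
# where s(x) is the straight edge error and g(x) is the machine guideway error.
normal_run   = [0.0, 1.2, 2.0, 1.5, 0.3]    # hypothetical readings, microns
reversed_run = [0.0, -0.4, 0.6, 0.9, 0.5]

guideway      = [(m1 + m2) / 2 for m1, m2 in zip(normal_run, reversed_run)]  # g(x)
straight_edge = [(m1 - m2) / 2 for m1, m2 in zip(normal_run, reversed_run)]  # s(x)
print(guideway)       # [0.0, 0.4, 1.3, 1.2, 0.4]
print(straight_edge)  # [0.0, 0.8, 0.7, 0.3, -0.1]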
 