Understanding uncertainty and how to report it
#1: MindCollizion (California, United States)

Good morning everyone,

The company I work for has moved to in-house calibration for the majority of its assets. We're a manufacturing company, and I use GAGEtrak software to manage my workload. I have to report the uncertainty on my certificates, but I'm having difficulty understanding whether what I'm doing is correct.

When I use a reference standard on a calibration, the standard is referenced on the individual test point, its associated uncertainty is combined with those of the other standards used during the calibration, and a value for uncertainty is given on the report. However, certain standards do not have a statement of uncertainty on their calibration certificate (outsourced calibration). What I have been doing is looking at the measurements reported on the certs and taking an average of the different readings, or just identifying the deviation from nominal, and reporting that as the uncertainty of the reference standard. I think this is incorrect.

There are many things I'm still having a hard time wrapping my head around, so I wanted to reach out and ask whether there are any good resources to read, or any advice you could give, so that I can fully understand what I'm supposed to be doing.

Unfortunately for me, my previous field of work was entirely different from the position I'm in now, and despite my efforts to educate myself I haven't fully understood how to report uncertainties in my calibrations. Any help is appreciated.

#2: Canandaigua, NY, USA

NIST has a bunch of material on uncertainty, including an online calculator; a site-specific Google search will turn it up. An example is NIST Technical Note 1297, https://www.nist.gov/sites/default/f...09/tn1297s.pdf, but I barely understand half of it, and it seems like you have to guess at some of the numbers.
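
The one piece of it I do follow is the "Type A" evaluation: the standard uncertainty you get from repeated readings is just the experimental standard deviation of the mean. A minimal sketch of that piece in Python (my own illustration with made-up readings, not NIST's calculator):

import statistics

def type_a_standard_uncertainty(readings):
    """Type A evaluation per NIST TN 1297 / the GUM: the standard
    uncertainty of the mean of n repeated readings is s / sqrt(n)."""
    n = len(readings)
    s = statistics.stdev(readings)  # sample standard deviation (n - 1 in the denominator)
    return s / n ** 0.5

# Ten repeated readings of one test point, in mm (made-up numbers)
readings = [25.4001, 25.4003, 25.3999, 25.4002, 25.4000,
            25.4004, 25.3998, 25.4001, 25.4002, 25.4000]
print(f"mean       = {statistics.mean(readings):.5f} mm")
print(f"u (Type A) = {type_a_standard_uncertainty(readings):.6f} mm")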

#3: Rochester, NY, USA

A mostly understandable uncertainty explanation can be found here:
An introduction to expressing uncertainty in measurement

A very easy-to-use spreadsheet with a good example tab can be downloaded here:
https://nrc.canada.ca/sites/default/...t_template.xls

If you follow the example on the NRC spreadsheet, you will get most everything right. Note: the uncertainty values do not add linearly, because they are standard deviations; they combine in quadrature (root-sum-of-squares), and the spreadsheet handles this correctly.
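
To make "they combine as standard deviations" concrete, here is a rough sketch of the arithmetic the spreadsheet is doing, assuming the usual GUM root-sum-of-squares combination and a k = 2 coverage factor (the contributor values are made up):

import math

# Standard uncertainties (k = 1) for each contributor, in µm (made up).
# An expanded uncertainty U quoted at k = 2 on a reference cert gets
# divided by 2 first to bring it back to a standard uncertainty.
contributors = {
    "reference standard (cert says U = 0.50 µm, k = 2)": 0.50 / 2,
    "repeatability (Type A, std dev of the mean)":       0.10,
    "resolution (0.5 µm reading, rectangular)":          0.5 / math.sqrt(12),
    "temperature difference (rectangular)":              0.08,
}

# Standard deviations add in quadrature (root-sum-of-squares), not linearly.
u_c = math.sqrt(sum(u ** 2 for u in contributors.values()))
U = 2 * u_c  # expanded uncertainty at k = 2, roughly 95 % coverage

for name, u in contributors.items():
    print(f"  u = {u:.3f} µm  {name}")
print(f"combined u_c = {u_c:.3f} µm, expanded U (k = 2) = {U:.3f} µm")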

#4: Gordon B. Clarke (Denmark)

Originally posted by MindCollizion:
When I use a reference standard on a calibration, the standard is referenced on the individual test point, its associated uncertainty is combined with those of the other standards used during the calibration, and a value for uncertainty is given on the report.
The way I tend to look at the "uncertainty factor" given by some calibration labs is that nothing is perfect. The uncertainty figure takes all the plausible measurement uncertainties into account: accuracy of the apparatus used, temperature, and so on. Depending on the situation you're in, it can be regarded either as wearing both a belt and braces or as useful information.

Maybe I'm oversimplifying, but if you buy a gauge it won't be exactly at the max or min tolerance; there is a manufacturing tolerance as well as a wear tolerance. It's also why you risk getting parts rejected by a customer unless you are both using the same gauge.

I made this over 10 years ago for M36 thread gauges to show the possible legal tolerance variation in dimensions.

    http://f-m-s.dk/Thread%20gauge%20tolerances.pdf

#5: MattiJ (Finland)

Originally posted by MindCollizion:
However, certain standards do not have a statement of uncertainty on their calibration certificate (outsourced calibration). What I have been doing is looking at the measurements reported on the certs and taking an average of the different readings, or just identifying the deviation from nominal, and reporting that as the uncertainty of the reference standard. I think this is incorrect.
This is a common problem with factory calibrations and non-accredited cal labs.

Maybe the easiest (but not cheapest) way is to ask for an ISO 17025 accredited calibration. It always comes with measurement data and a calibration uncertainty, not just a pass/fail statement or a sticker slapped on your equipment. And the lab should be qualified to calculate the uncertainty properly.

Sometimes you can guess (and hope) that the cal lab has maintained a proper TUR (test uncertainty ratio) and used a reference that is at least X times more accurate than the calibrated instrument's specification. But that is still not the same as a calibration uncertainty, and if your work comes under any scrutiny (external audits or whatever) you will have a hard time explaining it.

Depending on which standard you are working to, the requirements can be confusing or even contradictory. Lots of calibration labs in the US still work with TUR, per the ANSI/NCSL Z540 standard. From your description it sounds like you are working with a total calibration uncertainty, which belongs more to the ISO 17025 standard/ecosystem, and the two don't mix and match easily.
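
For what it's worth, the TUR arithmetic itself is simple; the hard part is knowing the real uncertainty of the measurement process. A sketch assuming the span definition often quoted from ANSI/NCSL Z540.3 (UUT tolerance span divided by twice the expanded calibration-process uncertainty) and the traditional 4:1 acceptance rule, with made-up numbers:

def tur(tol_plus, tol_minus, expanded_u):
    """Test uncertainty ratio: UUT tolerance span divided by twice the
    expanded (k = 2) uncertainty of the calibration process (the span
    definition often quoted from ANSI/NCSL Z540.3)."""
    return (tol_plus - tol_minus) / (2 * expanded_u)

# Example: a micrometer with a +/-2 µm tolerance, calibrated by a process
# with an expanded uncertainty of 0.4 µm (made-up numbers)
ratio = tur(tol_plus=2.0, tol_minus=-2.0, expanded_u=0.4)
print(f"TUR = {ratio:.1f}:1 -> {'OK' if ratio >= 4 else 'below the usual 4:1 rule'}")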

#6: Gordon B. Clarke (Denmark)

Originally posted by MattiJ:
This is a common problem with factory calibrations and non-accredited cal labs.
I've never seen any manufacturing factory give a measurement uncertainty. It would certainly add to the cost, as each item would have to be tested individually.

There are manufacturing standards for micrometers, calipers and dial indicators. Most manufacturers live up to those standards.

#7: barbter (Spain)

Originally posted by MindCollizion:
However, certain standards do not have a statement of uncertainty on their calibration certificate (outsourced calibration). What I have been doing is looking at the measurements reported on the certs and taking an average of the different readings, or just identifying the deviation from nominal, and reporting that as the uncertainty of the reference standard. I think this is incorrect.
1. You NEED the calibration house to state their level of uncertainty. It should be on their cert. If they check a gauge block and call it 1.0000, what good is the cert if the method they used to check it has +/-25% uncertainty?

2. My understanding is that by taking deviations from nominal, you are assuming the nominal is correct. But how do you know the nominal is correct? Establishing that is the whole point of a calibration. So I'd say yes, what you're doing is incorrect, because you need their stated uncertainty.
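
A made-up example of the distinction: suppose the cal house certifies a 25 mm reference at 25.0004 mm with U = 0.0002 mm (k = 2). The 0.0004 mm deviation from nominal is a known bias you use as a correction; the 0.0002 mm is the uncertainty of the certified value, and that is what belongs in the budget. In rough Python (all numbers hypothetical):

NOMINAL    = 25.0000  # mm, marked size of the reference (hypothetical)
CERT_VALUE = 25.0004  # mm, value certified by the cal house
CERT_U     = 0.0002   # mm, their expanded uncertainty, k = 2

reading = 25.0030     # mm, what the unit under test reads on this reference

# The deviation from nominal is a known bias of the reference:
# use the certified value, don't assume the nominal is correct.
uut_error = reading - CERT_VALUE  # 0.0026 mm, not the naive 0.0030 mm
u_ref = CERT_U / 2                # standard uncertainty for the budget

print(f"UUT error    = {uut_error:+.4f} mm")
print(f"u(reference) = {u_ref:.5f} mm  (this goes into the uncertainty budget)")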

#8: MattiJ (Finland)

Originally posted by Gordon B. Clarke:
I've never seen any manufacturing factory give a measurement uncertainty. It would certainly add to the cost, as each item would have to be tested individually.

There are manufacturing standards for micrometers, calipers and dial indicators. Most manufacturers live up to those standards.
Maybe it's less common for "end user" products like calipers, but it's relatively common for lab standards.

A new gage block set is useless if it doesn't come with a calibration certificate with stated uncertainties (OK, not useless, but the first thing you would need to do is send it out for calibration if you claim any sort of traceability for your measurements/calibrations).

Beamex pressure gauges and calibrators, for example, all come with a 17025 cal cert (with uncertainties, as it is a 17025 cal).
Mitutoyo sells some of its gage block sets with a cal cert like this: https://shop.mitutoyo.eu/web/mitutoyo/en/mitutoyo/01.07.18/Gauge%20Block%20Set%2C%20Metric%2C%20JCSS%20Cert.%2C%20ISO/$catalogue/mitutoyoData/PR/516-338-60/index.xhtml

JCSS is the Japanese ISO 17025 accreditation scheme, so the certificate has to come with stated uncertainties.

#9: Gordon B. Clarke (Denmark)

Originally posted by barbter:
1. You NEED the calibration house to state their level of uncertainty. It should be on their cert. If they check a gauge block and call it 1.0000, what good is the cert if the method they used to check it has +/-25% uncertainty?

2. My understanding is that by taking deviations from nominal, you are assuming the nominal is correct. But how do you know the nominal is correct? Establishing that is the whole point of a calibration. So I'd say yes, what you're doing is incorrect, because you need their stated uncertainty.
I certainly agree with you on this. Several years ago a thread gauge was sent to three different calibration labs. The pitch diameter measurements on the certificates were not identical, as each facility used its own equipment.

The results weren't far from each other (a few µm, if I remember correctly), but still different. With the measurement uncertainty taken into account, I believe the thread gauge was approved by all three as being within tolerance. Which result, however, was "spot on"? That will forever remain a mystery.
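
As an aside, there is a standard way to ask whether two such results agree within their stated uncertainties: the normalized error En = (x1 - x2) / sqrt(U1^2 + U2^2), where |En| <= 1 counts as agreement (the criterion used in ISO 17043 proficiency testing). A sketch with made-up numbers of roughly that magnitude:

import math

def normalized_error(x1, u1, x2, u2):
    """En between two results with expanded (k = 2) uncertainties U1, U2;
    |En| <= 1 means the results agree within their uncertainties."""
    return (x1 - x2) / math.sqrt(u1 ** 2 + u2 ** 2)

# Pitch diameter of one thread gauge from two labs, in mm (made up)
lab_a = (33.420, 0.003)  # measured value, expanded uncertainty U (k = 2)
lab_b = (33.423, 0.004)

en = normalized_error(*lab_a, *lab_b)
print(f"En = {en:+.2f} -> {'agree' if abs(en) <= 1 else 'disagree'} within uncertainty")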

How much may a thread gauge vary and still be approved? This shows the pitch diameter tolerances for standard M36 gauges; it's probably more than most realize.

    http://f-m-s.dk/Thread%20gauge%20tolerances.pdf

