
Basic measurement questions

music321

Plastic
Joined
Jan 6, 2016
Hi,

I have two basic measurement questions.

First, if using a measuring gauge, I wonder from what point to what other point measurements are taken. For instance, if using an ordinary ruler, should the object being measured be aligned with the center of the lines, or one of the edges? All these years, I've simply tried to maintain consistency. Given the potential for varying thicknesses of lines on gauges, etc., I'd assume that measurements should be taken from the middle lines. Now that I'm hoping to measure with precision, this is something I really should know for sure.

Second, I have a pair of calipers. I don't know how to find the thickness of things with precision. For instance, if I clamp a pen between the jaws, I can take a measurement of the diameter. However, if I slowly increase the distance between the jaws, I can see that the digital readout is indicating an increase in distance, yet the pen is still held between the jaws. There's obviously compression at play, so I gradually open the jaws until the pen falls out. I assume that this is as precise a measurement as I'm going to get.

For objects that can't be picked up between the jaws, this becomes more problematic.

These instances aren't really important; they just illustrate a general problem. I guess I'd have the same sorts of problems when using highly precise micrometers, etc. So, what are the general guidelines? Is there a YouTube video on this? Thanks.
 
Let's start with a general statement. The instrument being used to measure needs to be 10 times more accurate and precise than the value being measured. To trust a measurement to .001", the micrometer needs to be capable of being read to .0001".

If you are trying to split hairs on a rule, you are asking for precision that's not there. First, let's define the two terms, accuracy and precision. Accuracy is how close a value is to an absolute value. An absolute would be something like the standard meter, which is the length of the path traveled by light in a vacuum in 1/299 792 458 of a second.

Precision can be defined in terms of acceptable deviation. A yard stick is often graduated in fractions of an inch. Carpenters would find that graduation is ok for framing a house, but a machinist needs something finer.

Four measurements were taken with a tenth micrometer by two persons

Value: 1.002 absolute

Person A    Person B
1.0024      1.0020
1.0027      1.0021
1.0026      1.0019
1.0018      1.0019

In the case of person A, is the value reported as 1.002 or 1.003? In the case of person B, the value reported is 1.002.

Same part, same mic, one observer is more precise than the other.
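Putting numbers on that table makes the distinction concrete. A minimal Python sketch using the readings above (1.002 is the absolute value):

```python
import statistics

TRUE_VALUE = 1.002  # the absolute value of the part

person_a = [1.0024, 1.0027, 1.0026, 1.0018]
person_b = [1.0020, 1.0021, 1.0019, 1.0019]

for name, readings in (("A", person_a), ("B", person_b)):
    mean = statistics.mean(readings)
    bias = mean - TRUE_VALUE                # accuracy: offset from the absolute
    spread = max(readings) - min(readings)  # precision: repeatability
    print(f"Person {name}: mean {mean:.5f}, bias {bias:+.5f}, spread {spread:.5f}")
```

Person A's readings spread .0009" while Person B's spread only .0002", even though both means round to 1.002.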

Which leads to your second question: how firmly to hold the mic? For really sensitive measurements where pressure is critical, special mics have been developed to control pressure. These are generally not shop-grade tools. For the general machinist, it's a matter of experience.

Tom
 

.
scale line thickness: if the lines are too thin they are hard to see, so they are made slightly thicker than the minimum so you can easily see them. a scale is not high precision. with a scale or ruler you would be lucky if it is actually within .002" per 12"; cheaper ones are often off, usually by less than 1/2 a scale division, so a 1/64" ruler could be .008" off per foot. i have measured many scales and it is hard to get 2 the same length within .002", and if you have many different brands it's easy to find some that are not that precise.
.
and of course, as you hold a ruler or scale it gets bigger from the heat of your hand. a .001" change in any gage, like an inside micrometer, is common from holding it with your bare hands for long periods of time.
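How big can the hand-heat effect be? A rough estimate from the thermal expansion of steel (assuming a typical coefficient of about 6.5 millionths of an inch per inch per degree F; actual gage materials vary):

```python
ALPHA_STEEL = 6.5e-6  # in/in/degF, typical for carbon steel

def growth(length_in, delta_deg_f):
    """Approximate change in length for a given temperature rise."""
    return ALPHA_STEEL * length_in * delta_deg_f

# A 12" steel scale warmed 10 degF by your hand grows roughly:
print(f'{growth(12, 10):.4f}"')  # about .0008"
```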
.
caliper and micrometer pressure of course needs to be controlled, both from squeezing the part and from springing the caliper or micrometer with excess pressure. obviously if you measure a gage block you can see how much excess pressure distorts readings. some bigger ones, like 20" outside micrometers, are very difficult to use (and repeat within .0001") as they often require higher pressure than a smaller 1" micrometer would need. with bigger micrometers it is extremely easy to get .0005" difference in readings from different pressure and from not being perpendicular on the part.
.
indicating bore gages use just the indicator spring pressure and usually need a method to help hold the gage centered in the bore; you rock back and forth to get the biggest reading (centered on the bore) and the smallest reading (perpendicular across the bore). hard to describe. from the heat of holding the gage with bare hands you usually see the reading change by over .0001" fairly quickly, so most wrap the gage with a rag to insulate it from the heat of your hands.
.
all gages have their limits. you do not measure to .001" with a 6" scale. sure, with a magnifier you can read to .010" and guess to .005", but nobody is going to accept 6" scale readings to .001" or .002".
 
if you want to read to .0001", most use gage blocks or ring gages for comparison.
.
if i have a bore to measure, i get the closest ring gage, at the same temperature as the part. if the ring gage is 10.00003" and i measure it with an inside micrometer that says 10.00053", then the inside micrometer is reading .0005" more than actual, and i compensate the measured part reading. afterward i go back and measure the ring gage again, because the heat of my hands causes the micrometer to grow, and i might need to compensate by a different value.
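The compensation arithmetic in that example works out like this (a sketch; the part reading is a made-up number for illustration):

```python
ring_actual   = 10.00003  # calibrated ring gage size
ring_measured = 10.00053  # what the inside mic reads on the ring
part_measured = 10.00160  # hypothetical reading taken on the part

# The mic is reading high by this much at this size and temperature:
offset = ring_measured - ring_actual  # .0005"

# Compensated part size:
part_actual = part_measured - offset
print(f'{part_actual:.5f}"')  # about 10.00110"

# Re-check the ring afterward: if hand heat changed the mic,
# the offset (and therefore the compensation) has changed too.
```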
.
if i use an indicating bore gage i do the same with the ring gage: set to the ring gage, measure the part, and then check the ring gage again. usually you can get repeatability to .00005".
.
all micrometers, inside and outside, and other gages like indicating gages, i check against gage blocks, compensating for the gage block error amounts, and i usually keep the gage blocks or ring gages in the machine, at the same temperature as the part.
.
it is extremely easy to get .0003" or more difference in readings because the part is not at the same temperature as the air in the shop. even when coolant is a 2000 gallon system temperature-controlled to +/-1 degree F, you can still get evaporative cooling: coolant sprayed into the air gets cold as it evaporates and cools the part. it is very common on a big part, when coolant is only touching one end, for that end of the part to have grown or shrunk because the coolant changed its size.
.
temperature control is the primary reason different operators might get different readings; even when using gage blocks or ring gages they have to watch temperature differences.
.
the other thing is that operators measuring a bore need to record the minimum bore size. with surface waviness of .0001" to .0003", obviously you can get readings .0003" apart. if you need to assemble parts, you usually need the minimum bore size and the maximum outside diameter to make sure the parts go together, so operators need to agree to record only the minimum bore and maximum outside dimensions. lack of agreement on how to interpret waviness can cause readings to vary considerably. obviously if there is taper to the bore, or it is smaller at one end, you need to know that too so the parts can assemble.
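The record-minimum-bore / record-maximum-OD rule reduces to a simple worst-case check. A sketch with made-up readings:

```python
# Hypothetical rocked/rotated readings, inches (waviness makes them vary):
bore_readings  = [10.0012, 10.0009, 10.0011]
shaft_readings = [10.0004, 10.0006, 10.0005]

min_bore  = min(bore_readings)   # what you record for the bore
max_shaft = max(shaft_readings)  # what you record for the shaft (max OD)

clearance = min_bore - max_shaft
print(f'worst-case clearance: {clearance:.4f}"')  # about .0003"
assert clearance > 0, "parts will not assemble"
```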
.
same if you have out-of-round (in addition to waviness): you can put a shaft into a bore and, as you turn the part, it locks up tight as the out-of-round moves around. thus you might get 2 parts weighing a ton each together, but then need to rotate one part to line up the bolt holes, and it is very annoying if the parts lock up tight; heavy parts stuck together are not easy to take apart. operators need to agree on how to record out-of-round as well as waviness. most times the parts need to assemble, and operators need to understand that.
.
parts often distort after being unchucked and after time goes by. that is, a 10" bore might be round to .0001", and as soon as the clamping bolts are loosened the part warps and the bore might be out of round by .0010", or 10 times more out of round than when bored. many parts distort with temperature changes and/or after days and weeks go by.
.
there are many reasons why measurements can appear to change; usually there is a reason for the variability. i have seen 100 parts that all warped .002" to .004" per 40" out of flat when unchucked. it was not a question of whether they would warp, only of how much. a warm part just machined will obviously measure differently as the part temperature changes. many a lathe part will not only change size from temperature change, it will be out of round when unchucked because the released 3-jaw chuck leaves the part slightly triangular.
.
same as a bore checked with a plug gage on a mill: when the mill vise is loosened the bore goes out of round and the plug gage won't go in, because the vise was squeezing the part.
 
Four measurements were taken with a tenth micrometer by two persons .....

Tom

Either you don't own a micrometer with a ratchet or you don't use it. If you use the ratchet (as was intended by the manufacturer) the biggest difference you'll get is 0.0001". Get more and you're doing something wrong.

Ever watched micrometer calibration?

I'm waiting for MatiJ to pitch in :)
 
Either you don't own a micrometer with a ratchet or you don't use it. .....

This was strictly hypothetical

Tom
 
Either you don't own a micrometer with a ratchet or you don't use it. .....

I can make my ratchet mic close a few extra thousandths by turning lower on the barrel... keeps me in tolerance most of the time. :)
 
Either you don't own a micrometer with a ratchet or you don't use it. .....

Bullshit.
Ever done MSA across a shop?
No way on God's green earth do you end up within a tenth on a micrometer, and I have thousands of data points in my MSA records, which we have kept since the early 80s.
What do you six-sigma a non-digital micrometer at?
................(for others, MSA is Measurement System Analysis, the accepted way you define how much you trust such stuff; six sigma is the trust band from guess to real)
No reasonable quality engineer or manager would make such a claim. You embarrass yourself and your profession.
45 years in, and more into the sub-micron measuring world than most, I'm quite capable with a micrometer. I can't do a tenth that can be trusted.
Bob
 
Bullshit.
Ever done MSA across a shop?
.....

You didn't address the point I raised. Did you or didn't you use micrometers with a ratchet to obtain the measurement results?

When "touchy feely" is used then of course the human factor will influence the results. I've probably seen a micrometer without a ratchet, but I can't remember the last time. A difference too might be that micrometers here are company property, and I've never seen anyone use their own.

Of course when I and others use micrometers here we measure in mm. When I wrote that the result could change (here it'd be by 0.001mm) then it would be because the dimension was close to either the one dimension or the other. I wouldn't dream of not using the ratchet.

To the OP. The result the micrometer shows doesn't mean that the result obtained is the factual dimension. Even the best micrometer has an accuracy tolerance.

Look at the measurement results I obtained (using a micrometer and measuring each item 3 times) 15 seconds into the video: only one result changes by 0.001mm.

Digital Caliper Accuracy - YouTube

No disrespect but I'm sure your experience is from several decades ago.

It's been a while, but I taught for years how to calculate sigma. Unless the distribution of the measurements is an exact bell curve (have you ever seen that?) the results are theoretical. I've seen some calculate as if it were a normal distribution when the measurement results were obviously from an uneven distribution (machining against a stop will give that result), or calculate a normal distribution from what were obviously unconnected batches.
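Sigma itself is easy to compute; the trap is assuming normality when it isn't there. A minimal sketch in Python with hypothetical repeat readings (the ±3-sigma band is only meaningful if the distribution really is roughly normal):

```python
import statistics

# Hypothetical repeat measurements of one part by one operator, inches:
readings = [1.0024, 1.0027, 1.0026, 1.0018, 1.0022, 1.0025]

mean = statistics.mean(readings)
sigma = statistics.stdev(readings)  # sample standard deviation

# The +/-3-sigma "trust band" -- valid only if the readings are normally
# distributed (machining against a stop, or lumping together unconnected
# batches, violates that assumption):
low, high = mean - 3 * sigma, mean + 3 * sigma
print(f"mean {mean:.5f}, sigma {sigma:.5f}, band {low:.5f} .. {high:.5f}")
```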
 
When measuring the likes of a pen, consider that the pen can distort from pressure and the measuring device can also bend and distort with excessive pressure. Agreed, a clutch drive in a micrometer will/may take away the chance of stressing the device, and perhaps both the pen and the measuring tool.

The best feel on a part is a slight rub-friction, so one can feel the part but not lock up solid-tight on it. That is a talent to develop with practice. Best to take a known-size part like a Jo block and measure it a number of times to develop the talent. You teach an apprentice to feather-feel the thimble of a micrometer with a Jo block as the thing to feel. Getting the right number even with one's eyes closed is the talent to develop.

Falling out of the measuring device is likely too loose by a ten-thousandth, and holding it in check/tight is likely too tight by a ten-thousandth, so your check/measurement has perhaps a plus-or-minus two-tenths spread or more.

I taught my Boy Scout troop how to use a micrometer. Boys aged 12 to 18 got it, yet one adult leader could not get it because he would not follow instruction or perhaps could not feel the slight friction on the part.

I agree a non-clutch micrometer and a digital caliper are not one-tenth devices. Perhaps one or two tenths with one in very good condition and a practiced feel, on a good day (but not close enough to ship a close part).

Consider how much practice does it take to make a long basketball shot, every skill takes practice.

DMF mentioned temperature... that, plus the fact that the part may not be parallel-sided or exactly round, so taking just one check without thinking about the variables is another problem.
 
Find an optical comparator to measure the pen if you need a very accurate measure. If it can compress, calipers/mics won't give very good info.

There's a thing called "Gauge R and R" which measures how much the person affects the measuring process vs. how much the measuring device does. I'm not sure the clutch on a micrometer makes the human factor zero, but probably helps a lot.
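A toy sketch of the idea in Python (this is a simplified variance split, not the full AIAG average-and-range or ANOVA Gage R&R procedure; the data are the hypothetical two-operator readings from earlier in the thread):

```python
import statistics

# Each operator measures the same part several times (hypothetical data):
data = {
    "A": [1.0024, 1.0027, 1.0026, 1.0018],
    "B": [1.0020, 1.0021, 1.0019, 1.0019],
}

# Repeatability (equipment variation, EV): each operator's scatter
# around their own mean -- what the device plus technique repeats to.
within = [x - statistics.mean(xs) for xs in data.values() for x in xs]
ev = statistics.stdev(within)

# Reproducibility (appraiser variation, AV): scatter between the
# operators' means -- how much the person affects the result.
av = statistics.stdev([statistics.mean(xs) for xs in data.values()])

grr = (ev**2 + av**2) ** 0.5
print(f"EV {ev:.5f}  AV {av:.5f}  GRR {grr:.5f}")
```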
 
I'm not sure the clutch on a micrometer makes the human factor zero, but probably helps a lot.

I dislike "zero" as much as "100%". The ratchet (I've never heard "clutch" used) force isn't identical on every micrometer, but it will give the same force on each measurement with its own ratchet. What the ratchet does is as good as eliminate human influence.

OTOH never forget the wisdom of Murphy. And to the OP :-

You can't buy a yard of elastic in Scotland. It keeps breaking when it gets measured.
 
You omitted to inform as to the point I raised. Did you or didn't you use micrometers with a ratchet to obtain the measurement results?
.....
No disrespect but I'm sure your experience is from several decades ago.
....

Of course we use the ratchet or spring-drag thimble.
Not so sure about the several decades back. Data goes back further than that but the latest new employee micrometer MSA test was 3 weeks ago.

And yes, with enough data points it is a nice bell curve, so I'm not sure of your point here.
Each person will have their own curve if the study is indeed blind. You absolutely can't do this with gage blocks or pins labeled with the correct size, and any R&R doing so is flawed.
Each will be shifted one way or the other, but with enough people you fill in the holes.
Bob
 
I dislike "zero" as much as "100%". The ratchet (I've never heard "clutch" used) force on each micrometer isn't identical but will give the same force on each measurement with its own ratchet. What the ratchet does is as good as eliminates human influence.

I suppose human A could wind it up slow vs. fast and get different readings. If we're talking 10ths, I don't think you can say anything 100% eliminates human influence. That would be assuming human input to ratchet (slip clutch jobber thingy) is a constant. Splitting hairs..
 
I suppose human A could wind it up slow vs. fast and get different readings. If we're talking 10ths, I don't think you can say anything 100% eliminates human influence. That would be assuming human input to ratchet (slip clutch jobber thingy) is a constant. Splitting hairs..

It looks as if you are putting words into my mouth that were never there.

"I dislike "zero" as much as "100%"."

If it is important never measure just once.
 
QT[The ratchet (I've never heard "clutch" ]
Agreed. I just used that term because I have used/had micrometers that use something other than a ratchet for torque-limiting, so I did not wish to use the name ratchet and suggest it was the only such device, sorry.

Perhaps I should have said torque-limiting device.

Many of us have used our finger-slip feel for the very same action and got to the ten-thousandths line for a close measure (but not close enough to ship a part where the customer will have more accurate devices)... and later used the built-in friction device to feel the part.

Micrometer - Wikipedia
Jump to Torque repeatability via torque-limiting ratchets or sleeves - A micrometer reading is not accurate if the thimble is over- or under-torqued. A useful feature of many micrometers is the inclusion of a torque-limiting device on the thimble—either a spring-loaded ratchet or a friction sleeve.

For an ordinary ruler, go same to same... center of line to center of line, right edge of line to right edge, or left edge of line to left edge.
 
QT[The ratchet (I've never heard "clutch" ]
.....

About all I can write is that we have very different experiences re micrometer measuring in the machining world.

"Perhaps I should have said torque-limiting device."

The name isn't important as long as you know what I mean by "ratchet". It doesn't limit torque. It gives the same torque (force) each time when used as it should be used. Slow is OK with a few clicks. Fast isn't.

"A micrometer reading is not accurate if the thimble is over- or under-torqued. A useful feature of many micrometers is the inclusion of a torque-limiting device on the thimble—either a spring-loaded ratchet or a friction sleeve."

I remember seeing and using micrometers without a ratchet, but that's got to be at least 40 years ago. I'm not sure what you mean by "over- or under-torqued": when the micrometer is "zeroed" on a known distance (0-1" simply by closing the anvil and spindle) and the ratchet is used consistently, it will give as accurate a measurement as the micrometer's accuracy allows.

There is a manufacturing standard for micrometers, and it even takes into consideration possible deformation when the micrometer is used with a specified force.

Of course it is possible to get two different measurement results using two different micrometers but that is due to the possible micrometer inaccuracy.

I've seen manufacturers and calibration labs calibrate micrometers and always with the ratchet. No two people have the same "touchy feel" and that difference is guaranteed to make a difference.

If anyone has made or calibrated micrometers and disagrees with me as to what I've written I'd love to hear it.
 
I guess I would say the ratchet limits the torque to the factory pre-set, exactly the same each time... and I think they are just fine. I think my 1 to 12" set is ratchet... yes, I haven't used them for a time.

Guess I am shameful, but I still have micrometers with different kinds of torque-limiting devices, and still have some with just the old solid barrels. I think the one I left at the deer-blind shop is an old one with no torque-limiting device, useful because nothing there needs to be in ten-thousandths (+-.002" is fine). The same one I used to teach Boy Scouts so they would know how to feel the proper method with or without a device... and they all got it, closing their eyes and measuring a Jo block to a half thou and better. Boys 12 to 17 years old, and my grandson at 18. I think it is an old Starrett 230, likely 30 years old.

*Guess I am trying to answer the OP's question.

Yes, I made a 1" micrometer in high school shop, even making the 40 tap.

I even have indicator micrometers still in my toolbox... and have used some of the highest-grade measuring devices (before I retired).
But they do still make friction micrometers, from the $40 Fowler to some over $200, perhaps because some old-timers still like them: they don't make that darn clicking sound to wake up the boss. :willy_nilly:

Buy Micrometers - Free Shipping over $5 | Zoro.com

for lathe work in the rain or spilling your beer.:cheers:
Mitutoyo Digital Micrometer, to 1", Waterproof 293-348-3 | Zoro.com
 








 