
how much do you trust your best measuring tool?

proFeign

I'm talking here about micrometers and calipers and indicators and height gages...

If you have a quantum-tunneling, unobtanium-enriched laser supermicrometer accurate to angstroms, that's interesting too, but I'm specifically wondering about small tools, and what people with far more experience than I have think their particular micrometer is accurate to in the real world. When machining, I think my good Mitutoyo 8" digital calipers are good to 2 thou overall, and maybe 1 thou if I'm very careful. My digital mics are Mitutoyo IP65 models, and they're definitely good on hard materials to 3 tenths, and a little better if I'm careful to clean the faces and the part of dirt and oil. They'll read within 0.00005" of size on gage blocks if I spend time carefully wringing the block to the anvil, but parts are seldom this straightforward to measure in the shop.

The tenths Federal indicator at work seems to do its job but is really finicky, meaning that even if the indicator were perfect the setup is unlikely to be, and I trust it to between one tenth and three tenths depending on the setup...

Please state make/model of your tools if relevant...

I am eager to hear if there is a consensus, or if companies have a policy like requiring the tolerance to be at least 3x the tool accuracy...?
 
The environment becomes as important as, if not more important than, the graduations on the tool at or near the accuracy you are describing (temp, humidity), and then there is the operator.
 
I have a Mitutoyo electronic digital mic I bought 20+ yrs ago. I can check it on Deltronic pins and inspection-grade gage blocks, and it always reads exactly what the standard reads. I trust it a lot....

On the dicey work I like my inspection tools moving as little as possible so I usually have my mean dimension stacked in gage blocks. The tighter the tolerance is the cleaner and more consistent you'd better be.

If all I could guarantee with a digital Mic is .0003 I'd toss it and get a new one.....
 
It is what it is.

To me, with the rough & ready work we do, the dial calipers are fine. If I have to hit +/- .005 they are ok. I use them to quickly scribe lines on first articles and check my work.

For +/- .001 or half a thou, depth mikes, OD mikes, etc., all calibrated, are what I grab.

That's about it. If you are trying to hit tenths, you're going to have to find something better.

sometimes a tape is all you need

all the best
 
All of my measuring equipment is either Helios or Mahr, and calibrated regularly. I trust the Millimess indicator to a couple of microns (graduated in 1-micron increments). The rest of the indicators are good for 5 microns or so. The calipers (Helios) I'll trust to about 0.1 mm. If I need to measure any more accurately than that, I'll take it to the inspection dept and measure it with a Zeiss, MMQ or some of the other super-precision stuff they have up there.
 
On the dicey work I like my inspection tools moving as little as possible so I usually have my mean dimension stacked in gage blocks.

Super good advice if you want accurate measuring. Just make sure the tolerances don't add up wrong.

Off-the-shelf dial indicators meet standards if they're within +/- one graduation. By design, test indicators are best for comparing, not measuring. I only trust either for measuring when used with gage blocks.

I measure big, shallow c'bores on motor adapters up to 18" to .001" with Starrett Master calipers. Mics with vernier to .0001". Either way you have to know your tools and compare to a standard close to the measuring size.

I prefer Starrett micrometers and calipers but like the Federal and CDI dial indicators. Swiss test indicators can't be beat. My old Lufkin tools are still pretty accurate as are my old B&S tools. Too bad the later B&S tools are junk. Mitutoyo makes a decent line of tools but the Mitutoyo storage boxes are terrible. My Mitutoyo tools are my working tools for most average jobs. I have yet to see any good Chinese tools.

As noted by Kustomizer, the temp and humidity will nail you if not right.

Walter A.
 
I'm kind of with scrapwood; it depends on what I'm doing. Some may think I'm nuts, but I trust my dial calipers to about + or - .0002; they are old Central Tools ones from Montgomery Ward I bought in the 70's. I also use some new Fowler economy-grade calipers for banging around; they are very fragile and feel pretty rough, but they do agree with my other calipers and mics. Of course, on stuff I want better than .001 or so I use mics, and use the opportunity to cross-check with the calipers; so far they've always been right. I'm not into digital; I guess I just like to see the needle a hair over or under or whatever. I've been using digital electrical test equipment for years, but I still like to see the needle swing on my old Simpson multimeter for a few things.
 
Good advice so far. If you have a measurement whose degree of precision is close to that of the available equipment, you almost always need some means of independent reference. Then you can use your everyday measuring equipment as a comparator instead of for direct measurement.

Since import gage blocks are so cheap these days, it's no great burden to purchase a set to have on hand, provided your workload warrants it.

Ideally one sets up a stack of gage blocks and uses it to verify a reading on the measuring equipment (mike, bore gage, bench comparator, indicating apparatus of whatever type). Then the measurement is taken and the error determined. Finally the equipment is re-checked with the gage block stack, proving the equipment before and after. This is an advanced technique intended to verify nothing went wrong between referencing and actual measurement.
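For anyone who hasn't done it, picking the blocks for a stack follows a simple rule: kill the rightmost digit first. Here's a little sketch of that for a standard 81-piece inch set; the helper name and exact set composition are my own illustration, not anything from a vendor.

```python
# Hypothetical helper: pick blocks from a standard 81-piece inch set
# using the usual "kill the rightmost digit first" method. Assumes a
# target the set can actually make (roughly .200" and up is safe).

def choose_stack(target_in):
    """Return block sizes (inches) that sum to target_in."""
    rem = round(target_in * 10000)   # work in ten-thousandths (ints)
    stack = []
    if rem % 10:                     # tenth-thou digit: .1001"-.1009"
        blk = 1000 + rem % 10
        stack.append(blk); rem -= blk
    if rem % 500:                    # thousandths: a .101"-.149" block
        blk = 1000 + (rem - 1000) % 500   # chosen so the remainder is
        stack.append(blk); rem -= blk     # a multiple of .050"
    if rem % 10000:                  # fifty-thou series: .050"-.950"
        blk = rem % 10000
        stack.append(blk); rem -= blk
    while rem:                       # whole-inch blocks: 1.000"-4.000"
        blk = min(rem, 40000)
        stack.append(blk); rem -= blk
    return [b / 10000 for b in stack]

print(choose_stack(1.2345))   # [0.1005, 0.134, 1.0]
```

Working in integer ten-thousandths sidesteps float round-off, and fewer blocks in the stack means fewer wringing films adding up.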

Naturally great care is taken so the heat of handling and the presence of warm body radiance doesn't affect any part of the apparatus, the gaging stack, or the work piece. Also a thermometer is present so that observed readings may be corrected for temperature and coefficient of linear expansion.

This comparative technique is probably too fancy for most home shop applications. However, it's wise to know how things have to be done if highly accurate measurements are to be taken in a manner compatible with routine shop metrology - in spirit if not in full technical compliance.

Anyone who has fitted wrist pins to pistons and rods has bumped against the limits of routine shop accuracy. Here clearances are held in the low tenths between pin and piston, and the interference is held to 0.0005" between the pin and the small end of the rod. Sunnen has built fine apparatus that makes fit determinations nearly painless, but there is always the temperature trap. How many have fitted aluminum pistons to pins on a warm day, assembled them on the rods and stored them for a time, then on a cool day prepped them for installation? Son of a gun!! The pistons are stiff on the rods. Welcome to differential thermal expansion land. Warm the pistons up 20 degrees and they have their clearance again. Anytime you work tolerances closer than one part in a thousand (regardless of measurement system) you are in the range where temperature plays an increasingly important part.
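To put a number on that 20-degree trap, here's a back-of-envelope sketch. The expansion coefficients are typical handbook values and the pin bore size is made up, but the result lands right where the story says: about a tenth of clearance comes and goes with the weather.

```python
# Back-of-envelope numbers for the warm-day/cool-day piston story.
# Coefficients and the bore size are illustrative assumptions.

ALPHA_ALUM  = 12.3e-6   # in/in/degF, typical wrought aluminum
ALPHA_STEEL = 6.3e-6    # in/in/degF, typical steel

def growth(dia_in, delta_t_f, alpha):
    """Diametral change for a uniform temperature change."""
    return alpha * dia_in * delta_t_f

pin_bore = 0.912        # hypothetical pin bore diameter, inches
dt = 20.0               # degF swing between assembly and install

piston = growth(pin_bore, dt, ALPHA_ALUM)
pin    = growth(pin_bore, dt, ALPHA_STEEL)
print(f"bore grows {piston:.5f} in, steel pin grows {pin:.5f} in")
print(f"net clearance change: {piston - pin:.5f} in")
```

Because aluminum grows about twice as fast as steel, the net clearance change is roughly half the bore growth, a bit over a tenth here.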

I never trust my measurement tools on anything important. I always cross check with the standard or a gage block stack.

By the way, anyone owning or lusting for a gage block set really should track down a gage block accessory kit - in the interest of completeness. They're hard to find, but they do greatly extend the utility of the gage block set. This link takes you to only one kind, intended for rectangular blocks, but many clever sets are available.

http://www.pmargage.com/koba/koba_gage_blocks/koba_gb_access.htm

I prefer the Mitutoyo.

http://www.jwdonchin.com/Mitutoyo/Catalog/GageBlocks.pdf#page=13&view=Fit


The Starrett 1" square block combination set has the accessories included but they are fearsomely expensive.

http://cgi.ebay.com/Starrett-846-W-...ZWD1VQQ_trksidZp1638.m118.l1247QQcmdZViewItem
 
Agreed on stacking gage blocks, but I have yet to need accuracy to this degree. I designed a part last month for a radiation detector that had to fit in a very, very tight area and use an off-the-shelf spring to hold it in place, which further tightened my tolerances, since they didn't have the spring size I would have chosen if I got to make it up. I cleaned my mic faces really well and checked it on the closest gage block I had, which was a few thou smaller, and then on the next one above it, and it was dead on, so I was confident enough in between to a thousandth.

It fit just fine, so that was good, and I toleranced the drawing such that it used a standard-size number drill. However, in the end I have to admit that Kustomizer is right and I am maybe a little optimistic with my tools; I turned a part today out of aluminum, took several .200 passes at a pretty slow 275 rpm while keeping it well cooled with soluble oil (a steady stream the whole time), and it was easily .001 smaller after it cooled off (it had only been warm to the touch). I didn't account for this, and it's a little undersize. Still fine for its function, but it is a very good lesson for me.

I usually oversize parts somewhat for finishing, but I've never heated a part up enough that it got this sloppy. It was a friction fit and it'll work into its mate just fine after a few uses, since it's held with four setscrews anyway, which will bite it pretty good, but it was an eye opener wrt temperature for me. All the parts I make are in service at room temp, so it's never a problem to find a temp-controlled lab; if my calipers and mic and the part are all at its service temperature, there is little to worry about.

All that said, I do trust my mic to .0001 when measuring small differences in thin parts. Specifically, lately I have measured a lot of "thick film" (.001-.0075") plastics, and they all mic out to about what they should be (maybe 50 millionths either way), and if I measure the same places twice I always get exactly the same thing. I use the .0001 indicator for the same thing on a freshly cleaned surface plate and it does the same, but if I were checking runout or something, that indicator wouldn't be useful for measurement anyway. This is the only time I've really used an indicator as a measuring tool, and it works great for covering a large area on these plastic sheets that a mic can't reach. Plus it's way faster. Also, I'm not QC, so these are just for reference purposes.

Anyway, I think for controlled conditions my mic is good to .00015 and the caliper to .001, but I can't really machine parts to .00015 anyway, so it's nice to have the overhead.

And a tape measure and a pocket scale are at least as helpful lately to me.

I learned on this forum that carpenters sometimes say "11 and a strong 5/8" to indicate that the measurement is somewhere above 5/8 but probably below 11/16... Interesting.

Also I got a fractions-reading Fowler Poly-Cal to leave in my car for whatever, and to see how a fractions-reading caliper works; the polycarbonate frame will easily flex enough to show .007 with the jaws closed.

OTOH my $18 Enco digicals hold .001 all day, but I don't expect them to live very long, and the steel and fit and finish are garbage compared to my Mitutoyos. The best fit and finish I've seen, though, is on an old B&S micrometer I got on eBay for $15; it has the most beautiful half-brushed, half-polished stainless finish I've ever seen on any tool. I cleaned it up and lubed it, and it shames all other mics I've ever used. I like verniers because nobody else I know can figure out how to read them. Talk about user error!

On a related note: what is an appropriate measuring force? I'm sure there are as many forces as operators; I think Forrest Addy once said one man's light touch is another man's hardest turn.

I cleaned out all my friction thimbles to loosen them up to very low force (this lets me consistently measure softer materials), and I'll tweak the solid thimble manually when it's closed on the part if need be on harder stuff.

--Try this with your friction/ratchet device: does it hurt to mash your thumbnail in the mic? This won't work with a solid thimble; a thumb is too squishy to get a good feel. My mics are set to a pretty loose slide, so it'll mash my thumb but not enough to be painful, just marginally uncomfortable. With a solid thimble I'd keep on going, because a thumb just doesn't feel right... I don't know of a better way to compare without force gaging; "not quite hurty" would be my way of characterizing it.
 
As the punishment has to fit the crime, so does the precision have to fit the class of work.

As the Mikado sang:

A more humane Mikado never
Did in Japan exist,
To nobody second,
I'm certainly reckoned
A true philanthropist.
It is my very humane endeavour
To make, to some extent,
Each evil liver
A running river
Of harmless merriment.

My object all sublime
I shall achieve in time —
To let the punishment fit the crime —
The punishment fit the crime;
And make each prisoner pent
Unwillingly represent
A source of innocent merriment!
Of innocent merriment!

All prosy dull society sinners,
Who chatter and bleat and bore,
Are sent to hear sermons
From mystical Germans
Who preach from ten till four.
The amateur tenor, whose vocal villainies
All desire to shirk,
Shall, during off-hours,
Exhibit his powers
To Madame Tussaud's waxwork.

The lady who dyes a chemical yellow
Or stains her grey hair puce,
Or pinches her figure,
Is painted with vigour
And permanent walnut juice.
The idiot who, in railway carriages,
Scribbles on window-panes,
We only suffer
To ride on a buffer
In Parliamentary trains.

My object all sublime
I shall achieve in time —
To let the punishment fit the crime —
The punishment fit the crime;
And make each prisoner pent
Unwillingly represent
A source of innocent merriment!
Of innocent merriment!

Chorus:
His object all sublime
He will achieve in time —
To let the punishment fit the crime —
The punishment fit the crime;
And make each prisoner pent
Unwillingly represent
A source of innocent merriment!
Of innocent merriment!

Mikado:
The advertising quack who wearies
With tales of countless cures,
His teeth, I've enacted,
Shall all be extracted
By terrified amateurs.
The music-hall singer attends a series
Of masses and fugues and "ops"
By Bach, interwoven
With Spohr and Beethoven,
At classical Monday Pops.

The billiard sharp who any one catches,
His doom's extremely hard —
He's made to dwell —
In a dungeon cell
On a spot that's always barred.
And there he plays extravagant matches
In fitless finger-stalls
On a cloth untrue
With a twisted cue
And elliptical billiard balls!

My object all sublime
I shall achieve in time —
To let the punishment fit the crime —
The punishment fit the crime;
And make each prisoner pent
Unwillingly represent
A source of innocent merriment!
Of innocent merriment!

Chorus:
His object all sublime
He will achieve in time —
To let the punishment fit the crime —
The punishment fit the crime;
And make each prisoner pent
Unwillingly represent
A source of innocent merriment!
Of innocent merriment!

The same goes for the resources, the attitude, the tolerances, and the degree of precision. I measure precision machine-shop fits in thousandths, framing carpentry in sixteenths, and cement into concrete by the shovelful.

.
 
Measuring things :)

Hmm, how much do I trust my best measuring tool? If I am working from a print, it is very helpful to know what will be used in QC to verify a size. For instance, it has been my experience that a hole checked with a plug gage will be allowed to be larger than a hole that is measured with a micrometer.

This may be old hat for some, but for anyone who hasn't, try this little exercise. Take a ring gage of "known" and "reliable" size that you can use to calibrate your inside micrometer, hopefully one whose size you don't already know, and use a piece of tape to obscure the marking. Use your normal procedure and measure the ring gage with the inside mic. Now use your calibrated outside micrometer to measure the inside micrometer, trying not to look at the thimble of the outside micrometer. Then, if you have access to one, put the ring gage on a CMM and see what you get. Then, if you are reallllly lucky, see if you can find a plug gage that will fit in the ring gage. Now peel off that piece of tape and compare all those (different??) numbers. Which number will be used to determine whether you get paid for making that hole?? :) Dave
 
"Some may think I'm nuts but I trust my dial calipers to about + or - .0002"

Yes....you are....



The other good practice is to check with two different tools. A caliper check will always tell you if you set up your bore gage wrong, or stacked your blocks wrong, etc. So you do a rough sanity check (to check your check), then the accurate check to say where it really is. This will save a LOT of scrap.....
 
A little piece of historical perspective

When I was serving apprenticeship in the late 60's-early 70's, the old-timers running the shop had some interesting comments about measuring tools and tolerances. Realize, of course, that they were speaking of a different era-around 1930.

According to their stories, back in the 1930's the only people who had micrometers were the inspectors. The machinists were expected to make do with a 1/100-inch graduated scale and calipers-and not the vernier kind, either.

That being their background, they sometimes scoffed at my "overuse" of micrometers or such when doing non-critical jobs around the shop. Their thought process was that micrometers and such were rare and valuable things, not to be risked or worn unnecessarily if not mandatory. They had lived in the depression era--a different world.

I decided to see what kind of tolerance I could achieve with just a Starrett 1/100th reading scale and simple calipers. After some daily practice on various jobs, I found I could hit the desired dimension to plus or minus 0.003 inch reliably (verified with micrometers). This was on either milling or turning jobs. Of course a key to this was developing the feel for the calipers.

All this history was the source of a commonly heard phrase they used around the shop--"Scale it!" The boss would say that to tell me the level of precision for a particular dimension or job. It was just another way to keep the help from going overboard on tolerances, as we usually worked from sketches, not dimensioned drawings.

I agree that is worlds away from most of this thread. I just offer it as an example of what is possible when you are practiced and blessed with the excellent eyesight of youth. Now after having said this, somebody will likely say they can/could do better. Great, but that is not the point. The point is what is possible with some pretty crude measuring tools, good eyes (or, in old age, a good magnifier) and some practice. It also taught me something about avoiding over-tolerancing things.
 
If the print calls out +/- .003 I trust my verniers; if the print calls out +/- .001, I trust my mics (doesn't matter: B&S, Starrett, Mitutoyo, etc.).....have they ever let me down? Nope; I keep 'em in good shape and repair/replace them when they exhibit wear. On those occasions where I NEEDED, FOR SURE, +/- .0002 as a grinder hand, my mics AND my B&S digital verniers have always hit the dimension accurately (as verified by the QC dept)......but I only used them for a 'working measurement'. When I needed to verify my test samples (as spec'd by QC and the print, every 20th piece), it was the indicating mic (or indicating bore gage, or indicator on an inspection surface plate, optical comparator, etc., obviously depending on application), properly set up with inspection gage blocks, etc.
on edit: temperature variation always needs to be accounted for, but that's something that comes with experience....I knew what dimension I needed to make the parts to, to meet tolerancing on the print FOR WHAT I WAS DOING as a grinder hand.....I would've been lost out on the big Toshiba CNC HBM trying to account for temp variants.....likewise, the guy who ran the Toshiba knew exactly what he needed to do to make that 4,400# block of copper to spec on a 95 degree day, but he would've been lost standing in front of my 12-24 Okamoto grinder.........
 
When I was serving apprenticeship in the late 60's-early 70's, the old-timers running the shop had some interesting comments about measuring tools and tolerances. Realize, of course, that they were speaking of a different era-around 1930.

I decided to see what kind of tolerance I could achieve with just a Starrett 1/100th reading scale and simple calipers. After some daily practice on various jobs, I found I could hit the desired dimension to plus or minus 0.003 inch reliably (verified with micrometers).

Well said! You must have had better eyes than me. The best I could do repeatably was .007" although I did get lucky once in a while.

Up until retirement my spring calipers (mostly friction type) got a pretty good workout.

My grandfather was a machinist at Hyatt Bearing in the 20s. He owned and used all types of measuring tools including indicators and micrometers. The fellows you spoke of must have come from a little rougher shops.

My Great Grandfather was a machinist at Brooklyn Navy Yard near the turn of the century. His measuring tools were scales, spring calipers, folding rules and plumbs. As you could tell by the tools he was an outside machinist.

Walter A.
 
I trust most any instrument that removes as much of the human element as possible. I'm talking about parallax for vision, tension for feel. I like digital because no parallax is possible, although analog gaging does have its place. I like indicating micrometers and any instrument that has a constant force application, eliminating the feel factor. Checking a dimension with two instruments is good (at least on the first go) to eliminate, or at least reduce, the .025" off you might get on a micrometer reading. ALWAYS checking an instrument against a standard block/plug gage before measuring, regardless of the calibration cycle, is one of the best ways to eliminate the possibility of error. I don't believe that periodic calibration is a reliable way of assuring quality, although many companies require certification of calibration of all measuring instruments as quality assurance (I do believe, however, that the standards must be re-calibrated from time to time, and personnel must be trained in the use and care of those gages).
I also believe in the 10:1 rule: if your tolerance is ±.005 you can use your calipers; at ±.001, micrometers; and tighter tolerances demand finer tools.
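That 10:1 rule of thumb can be sketched in a couple of lines. The helper and the numbers below are illustrative, not any company's actual gaging policy.

```python
# Sketch of the 10:1 rule of thumb: the tool's stated accuracy should
# be no worse than one tenth of the tolerance band it is checking.

def tool_ok(tolerance_band, tool_accuracy, ratio=10):
    """True if the tool meets the chosen accuracy ratio.
    A tiny epsilon guards float rounding right at the boundary."""
    return tolerance_band + 1e-12 >= ratio * tool_accuracy

print(tool_ok(0.010, 0.001))    # +/-.005 with .001 calipers -> True
print(tool_ok(0.002, 0.001))    # +/-.001 with .001 calipers -> False
print(tool_ok(0.002, 0.0001))   # +/-.001 with .0001 mics    -> True
```

Some shops relax this to 4:1 when the 10:1 tool simply doesn't exist for the job; that's what the `ratio` parameter is for.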
 
No one has mentioned the need to check and recheck critical measurements in an attempt to remove human error. Standard practice in metrology is to sample the feature 30 times and calculate the mean.
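For anyone curious what repeated sampling buys you, here's a quick sketch: the mean settles down much faster than any single reading, and the spread tells you how much to trust the operator-plus-tool combination. The readings below are invented for illustration.

```python
# Sketch of the "sample it 30 times" practice: average repeated
# readings and look at the spread. The readings below are invented.
import statistics

readings = [0.5002, 0.5004, 0.5003, 0.5001, 0.5003,
            0.5002, 0.5004, 0.5002, 0.5003, 0.5002]

mean = statistics.mean(readings)
sdev = statistics.stdev(readings)          # sample standard deviation
sem  = sdev / len(readings) ** 0.5         # standard error of the mean

print(f"mean {mean:.5f}, 1-sigma spread {sdev:.6f}, std error {sem:.6f}")
```

The standard error shrinks with the square root of the sample count, which is why 30 readings pin down the mean far better than one, even with a tenth or so of operator scatter.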

As I understand it, dial calipers are reliable to 0.05 mm, vernier calipers to 0.02 mm, and an outside micrometer to 0.01 mm. It seems to me that many machinist contributors above place too much confidence in the accuracy of their measuring tools. That said, I can well imagine that confidence would be reinforced over decades of experience and successful employment.

Oh! Nice singing voice, Forrest! Bass/baritone and regular member of the local musical society. Tarantara! Tarantara!
 
The difference between trusting the tool and trusting that the part is as the tool measured...

I buy new measuring tools of quality names (Mitutoyo for the most part) and check them against quality gage blocks (Webber). I trust these tools and the measurements they give me. Since I got the set of gage blocks, my trust in my other tools has increased. I have a hard time trusting other people's tools, and I don't buy used measuring equipment for inspection for this reason. Sometimes exceptions can be made, but I wouldn't buy a 40-year-old mic to save $20.00...not that there's anything wrong with that; it's just me, and the trust factor.

If I were to measure a part that is at 50 °C, I'd trust the measurement of my tools; that doesn't mean the part will be any good when it cools to 20 °C. There are indeed many other factors that add up, but if you can't trust the "standard," then everything quickly goes downhill from there. A bit like someone who checks his mic on a gage block and deduces that the gage block is out .002"... there is trust, just not in the right tool, and that surprisingly happens a lot. On a few occasions I did hit .0005" ID tolerances with digital calipers in good shape, when they were all we had to measure a detail with.
 
I trust my B&S dial caliper to .0005, my B&S and Starrett mics to .0001. But only in my hands, or the hands of someone I have experience with and also trust.

On the other hand, I have a Mitutoyo dial caliper I trust to .010.
 