
Pin Gages for measuring hole depths

silveroranges

Plastic
Joined
Aug 27, 2019
First, let me start off by saying I am still learning, so if I don't use the right word or terminology, please correct me. I'm in my first year of college for mechanical engineering while working as a machinist apprentice to pay the bills.

At my job we are using pin gages to measure the depth of a drilled hole on a mass-produced part (a medical device). Recently the tolerance on the hole depth was changed from +/- .010" to +/- .002". I am finding inconsistencies with the pin gages used to measure the part, specifically the break edge on the ends. According to the spec the gage pin manufacturer works to, ASME B89.1.5-1998 (R2009), "Plug gage ends are not required to be perpendicular or controlled." The gage pins' diameters are in spec, but the tapers on the ends are different sizes and lengths; the distances from the small diameter of the taper to the large diameter (of the break edge) vary by as much as .006", at least on the ones I have measured on a MicroVu.

Because of the drill point angle we use for the hole (90 degree included angle), the pins 'fall' to different depths depending on the size of their break edges.
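The break-edge effect scales directly with the 45 degree half angle of that drill point. A quick arithmetic sketch (my own illustration, using the .006" spread from the pins I measured):

```python
import math

def seating_depth_shift(break_edge_dia_spread, included_angle_deg=90.0):
    """Extra depth a pin drops into a conical hole bottom when the
    break edge reduces its effective contact diameter.

    The cone wall sits at the half angle, so a radial reduction dr in
    the contact radius lets the pin drop dr / tan(half_angle) deeper.
    """
    half_angle = math.radians(included_angle_deg / 2)
    return (break_edge_dia_spread / 2) / math.tan(half_angle)

# .006" spread in effective contact diameter between pins, 90 deg point:
print(seating_depth_shift(0.006))  # ~.003" of pin-to-pin depth variation
```

That ~.003" of pin-to-pin variation alone is wider than the entire +/- .002" band, which is the whole problem in one number.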

Has anybody encountered this issue before? One fix would be to grind the faces of the pins perpendicular to the axis of the shaft, but then we are essentially relying on an uncertified tolerance, which defeats the whole purpose of buying gages with NIST-traceable certificates.

Looking for some advice and others' thoughts before I go be a pain in the neck for the engineers (who, hopefully, I will eventually be working with).
 
A simple solution would be to just make your own gage with whatever end profile you desire. As far as a drilled hole with +/-.002" depth tolerance goes, I can only roll my eyes. This seems about like the pinnacle of silliness to me.
 
Pin gauges like you have are held to diameter only, not length.

You could grind your own lengths.

Or you could probably contact one of the manufacturers for a precision-ground length of the pin you need.

I would think a pin of the correct diameter and length dropped to full depth might be tough to get back out.
 
I should have added some more information: it is a counterbored hole (well, sort of, since it is drilled, so it has an angle on the bottom), and the pin we use to measure the depth is the 'go' pin from a set of go/no-go gage pins. The part is produced on a Swiss lathe. Essentially you verify the hole is the correct diameter, then roughly measure the depth with the go pin and a caliper. But with this new tolerance, different pins are giving me out-of-tolerance readings while others are still in. And I agree it is very silly, since the hole depth doesn't really have any bearing on the part; but because we use a step drill for the hole, the hole depth relates to another feature's tolerance. This was their answer to try to lock down that other feature's tolerance, which is a chamfer on the tip of the part.
 
Yes, medical work is goofy. Your depth will change as the corners of the drill wear.
What I used to do was drill slightly shallow, then go back with a flat bottom drill (or b-bar) to square the corner out.

Your gage pin needs a positive shoulder to stop against, or you will be chasing your tail all day.
 
Makes sense, but my medical customer would never go for it.
Like I said, medical is ridiculous; that's why you can get so much money for it.
 
I should have added some more information, it is a counter bored hole (well sorta, it is drilled so it has an angle on the bottom), and the pins we are using to measure the depth is a 'go' pin, on a set of go-no go gage pins. ...

Sounds like a lot of stuff going on with this. If you are working as a first-year machinist apprentice, you probably have a supervisor, who should be an ally in discussing this issue with your quality manager and the engineering group. You are obviously correct about using standard pin gages as a depth measurement aid; they are not routinely certified for that use because of the uncontrolled end tolerance. It MAY be that the people who dreamed this up are unaware of the pin gage tolerances (or lack thereof) and need to be informed. If I were in your shoes, I would simply work out the numbers based on all the min/max conditions to show that trying to measure this way to +/- .002 is going to be ridiculously inconsistent, and engage your supervisor. Doing this in a CAD program is probably best for demonstration purposes.
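That min/max exercise can also be roughed out in a few lines before it ever goes into CAD. The contributor values below are illustrative assumptions on my part, not numbers from the post:

```python
# Hypothetical worst-case stack for the pin-plus-caliper depth check.
# Contributor values are assumptions for illustration only.
contributors = {
    "break-edge seating variation (.006 dia spread, 90 deg point)": 0.003,
    "caliper accuracy": 0.001,
    "operator feel / seating force": 0.001,
}

worst_case = sum(contributors.values())
tolerance_band = 0.004  # total width of +/- .002

for name, value in contributors.items():
    print(f"{name}: {value:.4f} in")
print(f"worst-case spread: {worst_case:.4f} in vs band: {tolerance_band:.4f} in")
```

Even with generous guesses, the gage contribution alone can eat the whole band before any process variation is counted.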
 
Exactly. How do you spec something if you can't measure it? And if they can measure it,
ask them how they're doing it...

I always laugh at the conversations with those measurements...this dimension is critical...how do we measure it? Lol just like when they dimension something to a theoretical point.
 
First let me start off for saying, I am still learning... Has anybody encountered this issue before? A fix would be to grind the faces of the pin perpendicular to the axis of the shaft, but then we are essentially relying on an uncertified tolerance, which defeats the whole purpose of buying gages with certificates for NIST.

Grinding the end of the pin perpendicular is a good and simple idea. Even if you're worried it's not "certified," you can still verify and calibrate it yourself; it sounds like you have the equipment to do so. Another option is to make your own "flush pin" with a step ground in at the low end of the tolerance. Again, though, you would have to check and certify that gage before using it.
 
I always laugh at the conversations with those measurements...this dimension is critical...how do we measure it? Lol just like when they dimension something to a theoretical point.

I had one of those not too long ago. This may be a long one... About a 50" radius on the end of a 3.500" diameter pin. The 50" R was a two place decimal, so +/-.010" for this particular customer. No big deal, except that the shape of the spherical curve near the center of the radius doesn't change much at ALL with even a .020" R change when you're talking a 50" R. From one end of the tolerance to the other we're talking the spherical shape changing by tenths across the curve. Location of the end wasn't tight in this case but the shape of the spherical end was! The guy who quoted the job didn't understand that part.

I used the readouts and a button tool to generate the radius, and then it was to be chrome plated. I warned him about that affecting the radius too - that the chrome would build heavy near the corners and edges of the part and it might get bounced at inspection if he's not real careful. So they get it back and polish the chrome a bit and send it to the customer. It gets bounced. Radius is perfect in the middle but no good near the edges. Too flat, bulged out from the chrome. I tell him get it radiused, chrome plate it, then send it for grinding. He does that, now it's looking perfect. It gets sent to the customer.

I go along this time, to see what they're inspecting it with. The inspection guy boots up a 5-foot-long Faro arm. I'm already skeptical, but I keep quiet. They check the radius by building a point cloud. He punches a button and the machine pronounces it spherical within .0004" and within tolerance for the radius. Great. I ask him, "Just for giggles, do it again, would you?" So he does. This time it's out .0014". The next time, .0023". Then it's back to .0002". The calculated radius has varied by something like .050" between all these. I ask, "What's the claimed accuracy on your arm there?" He says, "Oh, it's pretty good, something like +/-.001"."

*Forehead slap.*
 
Years ago we modified a depth mic by cutting down the rod, drilling and reaming a hole, and silver soldering in a pin ground square on the end. Then we pressed a drill bushing into the base, indicated it square, and ground it flat. If done right, it's just as accurate and easy to use as any depth mic. We made three: .029 dia, .059 dia, and .090 dia. Much more accurate than measuring over pins, with no math errors, and you can calibrate them.
Pete
 
I used a smaller FARO arm setup to do plastics that were semi- to very reflective... we had several lighting rigs to reduce the contamination, but there were cases where all we could do was spray the parts with foot powder and retest until we got measurements that were within the "over" by more than our best guess. Presently, I work at a place that does work for military hardware contractors, and all of their parts are massively overcontrolled in the blueprints. How overcontrolled? We have true position called out to +/- 0.003mm for an unthreaded screw hole that mounts a plastic ring on the inside to keep wire bundles together/separate. That hole could be off location by 0.500mm and it wouldn't affect fit, form, or function. Madness.
 
When the tolerance was plus or minus .010" I would have zeroed a caliper on a pin gauge and then used the step function of the caliper to measure directly how much of the pin is sticking out. You need a different plan for plus or minus .002". I like the idea of finding out how the customer checks it and then doing the same.

Got a sneaking feeling they don't even check it at all, and they're relying on you jumping through hoops to make it closer than necessary so they don't need to.
 
We use some molding compounds at work for measuring ID features. One we are getting is from Struers. We had a demo last week and were impressed. We also looked into their products for measuring surface finish. We did a sample on a surface-finish standard plate, and I was off by 0.06 microns versus an interferometer. We are looking at using it for other ID features.
 
..using pin gages to measure depth of a drilled hole on a mass produced part ...

Get a gauge made for the job, and copies for whoever is making the part if it's not done in house. +/- .005 is a PIA with pins, as the edge break is not controlled. You could change the print to drop a standard-size ball bearing down the hole and measure to the top of it. That would be easily measured, though it might not be measuring exactly what you want.
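The ball-drop idea is easy to put numbers to. Under assumed dimensions (a .250" hole, a .1875" ball, 90 degree point), the trig looks roughly like this; the function names and values are my own illustration, not from any print:

```python
import math

def ball_top_above_apex(ball_dia, included_angle_deg=90.0):
    """Height of the top of a ball above the apex of the conical hole
    bottom, with the ball resting on the cone wall. The ball center
    sits r / sin(half_angle) above the apex."""
    a = math.radians(included_angle_deg / 2)
    r = ball_dia / 2
    return r / math.sin(a) + r

def hole_depth_from_ball(reading_face_to_ball_top, ball_dia, hole_dia,
                         included_angle_deg=90.0):
    """Back out the depth to the full-diameter point of the hole from a
    depth-mic reading taken down to the top of the ball."""
    a = math.radians(included_angle_deg / 2)
    depth_to_apex = reading_face_to_ball_top + ball_top_above_apex(
        ball_dia, included_angle_deg)
    return depth_to_apex - (hole_dia / 2) / math.tan(a)

# Example: a .250" hole drilled .500" deep (to full diameter) puts the
# cone apex at .625"; the reading to the ball top round-trips to .500".
reading = 0.625 - ball_top_above_apex(0.1875)
print(round(hole_depth_from_ball(reading, 0.1875, 0.250), 6))
```

A ball's diameter and sphericity are tightly controlled, so unlike a pin's break edge, everything in that calculation is actually certified.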
 