
Need some help with measuring fixture

lbhsbz

Cast Iron
Joined
Dec 11, 2009
Location
Long Beach CA
The company I work for deals with automotive brake parts. We've recently uncovered a process snafu at one of the factories we deal with, and suspect parts might be coming in with excessive thickness variation. The boss came to me instead of engineering because he didn't want to spend $500k over the next year...he gave me the weekend and a few grand. Our max allowable thickness variation is 0.012mm. For reference, one of the wife's hairs is .067mm thick.

Here is the test fixture that I built...I had a different design, but the boss trumped it in favor of this. Runout is not a concern here, so the device ignores that and focuses on thickness variation only.

[Attached photo of the fixture: ImageUploadedByTapatalk1354408312.500267.jpg]

My problem, that I'm trying to sell to the boss, is that we're trying to measure microns here...the ground surface finish is skewing the TV measurement, and this won't produce any reliable/repeatable results. I can't get the thing to repeat twice. He wants a bunch of warehouse gomers to be able to run this test.

I'm a home shop guy....I'm not sure of the verbiage to make the point to the fool (boss) that this won't work.

Help.
 
I think you need to really examine what this fixture is actually doing. If all you want to do is measure thickness variation, grab a micrometer with .0001" or equivalent micron resolution and just mic the parts. I don't have a good feel for the details of what you have constructed, because I can't see all the parts, but it looks like you may be overconstraining the measurement assembly. If the 3-jaw chuck has any runout, and there is runout on the brake disk, you need a floating measurement head (with very smooth and linear motion) that is also stiff enough to give you repeatable measurements. You appear to be suffering greatly in the stiffness category, for measuring in single-digit microns with confidence. If you can supply more details of what you have constructed, perhaps more info will be forthcoming from others.
 
I think you're making this harder than it really is. For thickness variation, place the part on a surface plate, set an indicator to the correct dimension, and you're done; anything beyond that is an out-of-parallel check, done the same way at the same time.
If you need parallelism to mounting face, make a cylinder of appropriate height with faces ground parallel; set disk on that and indicate for parallel and height. All you'll need is a good indicator and surface plate, good set of gage blocks and a height gage.

On reflection, the tolerance mentioned, just under 0.0005, seems awfully close for a rotor hat.
 
We have possibly thousands of rotors to measure...so the mic thing is out.

If you can see in the picture, the indicator sits in an articulating C-frame, such that runout (induced by the chuck assembly) gets cancelled out. The problem I'm having is simply that he wants to set a few minimum wage morons on these things testing parts. A micron isn't very big. If you breathe on the thing wrong, the indicator notices. I've built the jig spring loaded so that any clearance basically doesn't come into play, but I'm of the opinion that, when measuring to the units at hand, it's more than a 20-second process.
 
The main stand is 1" 400-series stainless; the horizontal arm, which extends about 6", is 5/8" 400-series stainless. The clamp is a chunk of 6061...it's pretty stiff.
 
I doubt the indicator you are using is up to the task; also, as your pivot joint moves, your indicator will tilt slightly, giving you a larger reading.
As you said, a micron is not big. Repeatable squareness of the two measuring points will be important.

I would eliminate the joint and mount an indicator on top and one on the bottom using a sum/difference box to get your reading.
You will want capability at .001 mm, so I would use LVDTs both top and bottom with large-radius carbide tips.
Sorry, this will not be "low buck" electronics.
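The arithmetic behind a sum/difference box is simple enough to sketch. Here's a minimal Python illustration of why summing two opposed probes cancels axial float of the part; the function name and sign convention are my own assumptions for the sketch, not anything from a real gauge amplifier:

```python
# Convention: each probe reads + when pushed IN (the part surface moves
# toward the probe body). Both probes are zeroed on a master of nominal
# thickness, facing each other across the rotor.

def thickness_deviation(top_reading_um, bottom_reading_um):
    """Deviation from nominal thickness, in microns.

    A thicker part pushes BOTH probes in, so deviation = top + bottom.
    Pure axial float pushes one probe in and lets the other out by the
    same amount, so float cancels in the sum.
    """
    return top_reading_um + bottom_reading_um

# Part 2 um over nominal, no float: each probe pushed in 1 um -> +2 um
print(thickness_deviation(1.0, 1.0))
# Same part floated 5 um toward the top probe: top +6, bottom -4 -> still +2 um
print(thickness_deviation(6.0, -4.0))
```

That cancellation is the whole point of measuring both faces at once instead of referencing one face against a chuck that has its own runout.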

Also I hope your temp in the room is very stable since you are using 6061 for the mount.
Bob
 
I'd start by seeing what sort of repeatability you get using something like a Mahr indicating mic (the ones with a micron indicator at the tail). The microfinish is just one of many variables, including heat from handling of the part and several noted above (possibly overconstraining the "floating" head, thermal expansion, the generally light construction, the clamping method, etc.).

It looks like you're relying upon indicator spring pressure to assure contact on both sides of the rotor. Just using a mic, I'd guess you see as much as .01mm from varying pressure on a real micrometer from "feels like it's touching both surfaces" to "firmly seated." It would almost surely have that sort of deflection with a somewhat flimsy aluminum frame, if sufficient pressure were applied to firmly seat the "anvil" and spindle (in your case the indicator). Anyhow, I think you need a firmer as well as repeatable clamping pressure.

Note also that just a bit of grease or dirt on a regular micrometer anvil can cause the reading to be .01mm or more off. Your parts and the measuring apparatus contact points have to be scrupulously clean to measure to the .001mm you'd really like.

A fixture that firmly clamps the rotor down on precision pads, and then measures only from the top, would have greater repeatability. Instead of trying to rotate the part to see the variation (which can side load the indicator and cause it to stick and bounce), I'd settle on measuring a set number of points. That is: set part in fixture, clamp, measure; rotate part, clamp measure, rotate part . . . Each measurement might take under 10 seconds or so; possibly recorded with an SPC output indicator.
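The clamp-and-index scheme reduces to a max-minus-min calculation over the recorded points. A hypothetical sketch, assuming readings logged from an SPC-output indicator (the function names and sample numbers are made up; the 0.012 mm limit is the one from the thread):

```python
TV_LIMIT_MM = 0.012  # max allowable thickness variation, per the OP

def thickness_variation(readings_mm):
    """Thickness variation = spread of readings taken around the disc."""
    return max(readings_mm) - min(readings_mm)

def verdict(readings_mm):
    """Simple go/no-go on the recorded points."""
    return "good" if thickness_variation(readings_mm) <= TV_LIMIT_MM else "junk"

# Six stations around one rotor: clamp, read, rotate, repeat (made-up mm values)
readings = [25.012, 25.010, 25.015, 25.011, 25.009, 25.013]
print(thickness_variation(readings), verdict(readings))
```

The catch, as noted later in the thread, is that a finite number of points can miss variation between stations, so the station count has to be chosen against the lobing you expect from the grinding process.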

You could also construct a fixture that permits continuously rotating the rotor, but it would have to have greater contact pressure and might cost 10x. You could try one of the rolling indicator points made by Starrett, but I doubt it will help that much.

If you're measuring many thousands of these, I'd also consider if they'd be easier to handle if the fixture were oriented 90 degrees (more like a regular lathe spindle) -- probably depends on how the rotors are packaged when you receive them.

This is really the sort of problem that is best dealt with at the source. If manual measurements indicate your supplier has a problem, you want to trace that problem back to its cause and control it there.
 
There is a spring which pre-loads the lower roller (SPI...Shanghai Precision Instruments, lol) against the bottom surface of the rotor. The "C" frame thing floats with any runout that exists. The design and theory, I think, are sound...if we were measuring to .001". Past that, I think we're fucked as far as repeatable, reliable readings.


I'm on the same page as everyone...I think. This will not yield an accurate measurement. I did an experiment tonight...I held my fingers on the rotor for 30 seconds on a spot I marked...it grew 3 microns. This is stupid, the solution is expensive, and I've talked to a few metrology specialists about it. The ideal situation is 2 indicators, one on top, one on the bottom, both feeding into a PC and displaying a graph. Boss man says it's too expensive...he wants devices just like they have at the factories in China...he's happy with the current rendition. I think it sucks, but so long as I get paid to build what the boss man wants...I suppose it doesn't matter much, does it.
 
Here's how I would do it:

The whole system needs to be set up on a surface plate, probably dedicated to this function.

Get an electronic dial indicator that has a provision for raising the stem.
This commonly uses a shutter release cable from a camera; when pressed the stem goes up.
This would be mounted on a fixture such as you showed. I would use a material with very low thermal coefficient of expansion.

Make three supports of equal height from the same material as the fixture, with the upper end radiused.
A conical profile (large bottom small top) would be best so you have a wide base sitting on the surface plate.
Set these up on the surface plate, spaced so that they support the rotor at the desired measurement radius.

The supports can be loose on the plate initially, for preliminary evaluation, but should be firmly attached for actual use.
The same is true of the fixture holding the DI.

Since you're working with three points, you have a defined plane for your measurement.
The weight of the rotor will hold it firmly against the supports, so no clamping would be required.
The radiused profile of the support is preferable to flat or pointed profiles, as it provides repeatability without damaging the rotor.

Set the DI up on the fixture such that it's directly above one of the supports.
Using a gage block stack, calibrate the DI by setting its zero at the nominal dimension.
An alternate calibration standard would be an actual rotor that has been measured quite accurately and is dead on dimension.

Now you put a rotor in with the DI stem raised.
Release the stem to take a measurement. The reading is deviation from nominal. No calculations are required (i.e. moron mode).
If your DI has an SPC output you could log readings directly.
Raise the stem, remove the rotor and insert the next one. Repeat ad nauseam.

Not necessarily the best/fastest/cheapest/easiest approach, but it will do what you want with a high degree of repeatability.

- Leigh
 
I suggest using something like this. A "thickness quick measuring device" (that's how it is called in German).
This was just a quick google. I bet you'd find something similar in English. At least, you got a picture.

Nick
 
but it won't hold the kind of accuracy that the OP needs.

Well, I don't have one of them. Doesn't it hold the accuracy even with a 1/1000 dial indicator? Meaning: 1/100 indicator would not be good enough for the +/- 0.015 mm anyhow.

Anyhow, I wanted to make a suggestion for a setup, but what you described is what I wanted to say. So your suggestion is perfect. ;)


Nick
 
Why would a brake rotor ever need a thickness tolerance of .0005"? Even .005" seems tight. There must be a mistake somewhere.
 
Untrained warehouse people? Kinda sounds like you need to pick one or two of the more mechanically inclined ones and train them to measure with that kind of precision. I think Forrest nailed it with the surface plate and indicator; kind of hard to top. A go/no-go gage with the correct sized slots would be difficult to trust with that small of a spread.
Regards
 
What I don't understand is why you are measuring these ????

SOP around here is that one bad part found and your 1000 piece batch goes back to the supplier for 100% inspection.
It is up to them to provide documentation that all parts are good.
Life gets really bad for the vendor if the end user plant shuts down for a lack of parts as the vendor has to pay the assembly guys to sit on their butt waiting for good parts.
May God help you if the bad rotors actually got into finished cars and a batch here would most likely be much larger than 1000 pieces. :eek:

The pain with this dimension is that neither side is a true plane, so thickness checking requires at least 20 measurements, and you have to worry about scarring the finish if you put much spring pressure on the gauge points.
The fact that a "crown" (taper from OD to ID) is allowed, and some waviness is also allowed, makes any one-sided check impossible.

These are typical tolerances for a plant doing this work, no big deal for those who are experienced.

This is not your problem and you should not be eating the money for it.
You should let the vendor know that they are paying for the price of the gauges and the inspection time.

You have a decent grasp on the repeatability problem.
You will not be able to sort good/bad parts here without a 2-3 micron or better gauge R&R.
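One quick way to put a number on that repeatability requirement: measure the same spot on the same rotor many times and look at the spread. A hedged sketch (a crude single-operator, single-part estimate, not a full AIAG-style gauge R&R study; the readings below are made-up numbers):

```python
import statistics

def repeatability_6sigma(trials_um):
    """Rough gauge repeatability: 6 * sample std dev of repeated
    readings of the SAME spot on the SAME part by the SAME operator."""
    return 6 * statistics.stdev(trials_um)

# Ten repeat readings of one spot, in microns (illustrative only)
trials = [0.0, 0.8, -0.5, 0.3, -0.2, 0.6, -0.4, 0.1, 0.5, -0.3]
print(repeatability_6sigma(trials))  # if this exceeds ~2-3 um, you can't
                                     # reliably sort against a 12 um band
```

If the fixture can't hold that spread with the part untouched between readings, it certainly won't hold it across parts and operators.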
Bob
 
The company I work for deals with automotive brake parts. We've recently uncovered a process snafu at one of the factories that we deal with, and suspect parts might be coming in with excessive thickness variation. Our max allowable thickness variation is 0.012mm.


This issue should be taken up with the supplier. (I'm guessing ChinCom)
The fact that your boss wants both brake rotor friction surfaces to be within .012mm (.0004724") leads me to believe he is clueless.


Rex
 
Build your stand to hold the rotor in the proper vertical position and put an indicator on each side of the rotor.

The fact that your boss wants both brake rotor friction surfaces to be within .012mm (.0004724") leads me to believe he is clueless.

I work in the OEM automotive brake business too. We don't draw the prints, our customer's engineers do and we are held to having to produce parts to those prints within process capability. There is no recourse for an over-toleranced component. It seems sometimes like they do that on purpose. Rotor thickness variation sends pulses through the system to the brake pedal, can also cause squeal. Not a happy scenario to a new car buyer.

Untrained warehouse people?

Common practice in our business is to use inspection or "sorting" companies who pay low wages to people to weed out defective product. Tonight we have 4 different sorts going on looking for bad castings (defects) from the foundry. They supply the bodies, we have to provide the sort criteria and method. The odds of getting the same person back day after day is slim.
 
Why would a brake rotor ever need a thickness tolerance of .0005"? .005" seems tight. There must be a mistake somewhere.

You can feel 0.001" DTV in the pedal on most vehicles. We spec our parts at 0.0004" MAX DTV.

Anyway, I was a bit pissed over the weekend because I was working on these stupid fixtures cuz the boss needed 'em by Monday...even though he only told me about them on Thursday afternoon, and I had to work Friday. As expected, the indicator in the picture above was a stupid idea...very difficult to read. We needed a needle to watch, so I went and found some German-made SPI dial comparator gauges...which are better, but still suck because of the limited range, the factory's failure to hold the rotor's offset within the range of my new dial gauge, and the fact that you have to get the gauge within range before you can dial in the needle with the adjustment knob...a process which seems to move the indicator enough to knock it back out of range. What do they say about piss-poor planning??? LOL.

Anyway, I've settled on a normal drop indicator with .0001" resolution (I know, resolution is not accuracy, but the boss is happy with it) and fitted some 1/2" round flat tips instead of the wheel- or ball-type probe. The larger probe surface seems to do a much better job of ignoring the surface finish when determining the actual DTV. The surface finish we spec is 2 microns max...so that's roughly 20% of the DTV...no wonder the small-contact-surface probe sucked.
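The flat-tip effect can be demonstrated with a toy simulation. The model below is entirely synthetic (random roughness on a flat surface, made-up dimensions) and assumes a flat tip simply rides on the highest peak under its footprint, which is why its readings vary far less than a point probe's:

```python
import random

random.seed(0)
# Synthetic surface profile: 10 um nominal plus +/-1 um random roughness
surface = [10.0 + random.uniform(-1.0, 1.0) for _ in range(1000)]

def point_probe(profile, i):
    """A small ball/point tip reads the local surface height."""
    return profile[i]

def flat_tip(profile, i, half_width=25):
    """A wide flat tip rests on the highest peak under its footprint."""
    window = profile[max(0, i - half_width): i + half_width + 1]
    return max(window)

point_spread = max(surface) - min(surface)
flat_readings = [flat_tip(surface, i) for i in range(50, 950)]
flat_spread = max(flat_readings) - min(flat_readings)
print(point_spread, flat_spread)  # flat tip shows far less roughness noise
```

In other words, the wide tip mechanically low-pass filters the roughness, leaving mostly the longer-wavelength thickness variation you actually care about.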

Right now we're getting much more consistent and smooth indicator activity...the results using my fixture just about match our CMM readings and measurements taken with a mic...so we're close enough to establish good vs junk...and setup is simple. So easy a caveman can do it...which is good, because I'm now told we'll be hiring some temps to perform the testing...not QC temps as mentioned in the previous post, but regular temps that are dumb as posts.


As far as why we won't simply send the whole batch back?...because that would make sense, and be simple...a much more difficult, costly, and time-consuming solution is to do a 100% inspection to see if we can find some good parts so that we don't lose any more sales...so we'll do the latter.

I don't make the rules, I just do what I'm told and bitch about it on the interweb.
 