
Hilger & Watts TA51-2 autocollimator

Dear Peter,

The best manual I have found so far is the one that Dennis uploaded here:

This does explain how to swap the eyepiece from the end to the top and vice versa. But it does not have any details about adjustment. Instead, it says (top of page 9): "All adjustments other than those previously mentioned, are initially pre-set and scaled by the makers and it is advisable not to alter them in any way." So there is no adjustment/calibration procedure given there.

Cheers,
Bruce
 
A few days ago a HW sled and lambda/10 front surface mirror arrived (thanks Greg!). The sled had originally been factory fitted with a 2" diameter, 3/8" (9.5mm) thick front surface mirror, but that had been broken at some point in the past, so the sled had no mirror. The lambda/10 mirror from Edmund Optics is the correct diameter but 1/2" (12.7mm) thick.

My original plan had been to bore the mounting hole 1/8" (3.2mm) deeper. But after looking at the mount, I decided that this would weaken it too much. There is currently about 6mm of meat by the finger grips, and I don't like the idea of reducing that to < 2.8mm.

So instead I made an extender ring out of random tool steel. Here are the parts:
IMG_8219-new.jpg

At the bottom right is the original mirror retainer. It contacts the mirror at three points opposite the springs and contacts the cast iron face near those same three points, and is relieved elsewhere. My spacer ring at bottom left is 3.3mm thick, uniform to within about 1 micron at those three points near the holes. I then ground about 30 microns (0.0012") of relief on both sides, away from the three contact points.

Here is a closeup of my spacer, with the stock retaining ring underneath it. The grinding is functional, not pretty: I discovered that the coolant in my grinder had gone bad and spent most of my time replacing that, so I didn't spend the time to make it look nice.

IMG_8220-new.jpg

(By the way, if someone here can send me a few BA6 oval headed screws at least 1/2" long, I would be very grateful!)

Here is the mirror sled assembled:

IMG_8222-new.jpg
IMG_8221-new.jpg

The tests went well. The images are much clearer than with my provisional mirror, which I think was not flat.
I've been experimenting with the use of an inexpensive USB microscope to observe the crosshairs.

IMG_8223-new.jpg
This works very well:

IMG_8224-new.jpg

I can repeat readings at the 0.1 arcsec level without eyestrain!

Cheers,
Bruce
 
My model does not have the option to move the eyepiece and I found it a bear to keep viewing the cross-hairs when working, so I turned up a ring and glued that to an old phone case. I use a Motorola G7 power smartphone (which has phenomenal battery life) as a viewer. I just set the camera running in video mode and it works great.

View attachment 383375
View attachment 383376

View attachment 383377

Peter, does your aluminium adaptor ring fit on top of the eyepiece? Or in place of the eyepiece? Or does it have a lens inside? Cheers, Bruce
 
Excellent!
John, from reading other threads, I have the impression that you know a lot more about optics than I do. I have an optics question that you may be able to help me with. The context: I am still experimenting with fitting a camera to the AC, and am not yet happy with the results.

If I view the reticule with my eye, looking through the eyepiece correctly, I see a uniform green background over the entire reticule (12mm diameter) with the target crosshairs and reticule clearly visible and in sharp focus over that entire field. If I move my eye away from the eyepiece, but keeping it on the axis of cylindrical symmetry, the appearance changes. I can still see the entire 12mm diameter reticule, but instead of a uniform green background, I instead see a small bright green spot in the middle, with a much dimmer annulus around it. If I move my eye closer to the eyepiece again, the bright spot grows until it fills the entire field of view, restoring the correct appearance.

The incorrect view (a small bright circle in the middle of the reticule) is also what I see if I remove the eyepiece, and mount a camera in its place, with the lens focused on the reticule and the focal length/magnification set so that the field of view is 12mm. Can you tell me why that is and/or how to fix it? I'd like to get the same view through the camera as I get with my eye correctly located on the eyepiece. But right now, what I get with the camera is equivalent to the view obtained by placing my eye at the wrong position along the cylindrical axis of the eyepiece.

Cheers,
Bruce
 
Good Evening (California time), Bruce --

I've used telescopes in doing my job since 1975, but I'm far from being an "optics guy", just a user. Still, I've used enough optical equipment over the years to suggest that you try 1) using the eyepiece lens as a collimating lens, and 2) setting your camera focus to view the collimated image of the reticle being projected through the eyepiece lens.

This can, in theory, be done with widely varied separation between the camera and the acting-as-a-collimating-lens eyepiece lens. I suspect you would want to make an adapter to mechanically hold the two lenses coaxial, at near-minimum separation, and minimize stray light intrusion.

In focusing the eyepiece on the reticle, be sure to relax your eye muscles to the extent possible, and check for parallax error by moving your head laterally to verify that the reticle appears to be stationary.

Incidentally, I see a similarity between the eyepiece-and-camera situation and the eyepiece-to-eye situation, and that similarity might hint that the camera could show a sensitivity to the distance between it and the eyepiece somewhat like what you've visually observed.

Probably 30+ years ago, I experimented briefly with mounting a borrowed CCTV camera behind a line-of-sight telescope. The rudimentary proof-of-concept testing worked, albeit with a significant stray-light problem that I'm confident could have been eliminated with a bit of shielding. I couldn't get management blessing to buy a dedicated camera, and we quit using the fixed-reference 'scope before video camera prices went beneath the petty-cash threshold.

I do hope this rambling essay helps.

John
 
Peter, does your aluminium adaptor ring fit on top of the eyepiece? Or in place of the eyepiece? Or does it have a lens inside? Cheers, Bruce

Bruce, the adapter is just a ring that fits over the eyepiece with a grub screw to hold it. The AC is left intact and I found that the camera phone just 'sorted itself out'.
 
Bruce, the adapter is just a ring that fits over the eyepiece with a grub screw to hold it. The AC is left intact and I found that the camera phone just 'sorted itself out'.
Peter, thanks for the helpful reply. Can you zoom the camera phone in and out to see the entire reticule and then zoom in to see just the middle bit? Or does it just give you this view and that's it? Do you need to manually control the brightness and so on? Cheers, Bruce
 
Peter, please, could you make snapshots of "max zoom in" and "max zoom out" and then upload them here? They would be useful points of comparison for me.
 
Still, I've used enough optical equipment over the years to suggest that you try 1) using the eyepiece lens as a collimating lens, and 2) setting your camera focus to view the collimated image of the reticle being projected through the eyepiece lens.
Hi John,

This paper does exactly what you suggest:

Jie Yuan and Xingwu Long, "CCD-area-based autocollimator for precision small-angle measurement", Review of Scientific Instruments 74, 1362 (2003), https://doi.org/10.1063/1.1539896

The authors put a camera behind the eyepiece, with a zoom lens having a focal length from 8 to 50mm. The eyepiece of their instrument has a focal length of around 10mm, so this lets them range from the full field of view up to about 6x magnification of the crosshair target. I'll try something similar: I've got a small USB camera with a 5-50mm zoom lens that I can use.
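To sanity-check those numbers, here is a minimal thin-lens sketch. The 10mm eyepiece focal length and 8-50mm zoom range are the paper's values; the scale relation is just the standard afocal-relay approximation, not anything specific to their instrument:

```python
# Thin-lens afocal relay: with the camera focused at infinity behind the
# eyepiece, the reticule image on the sensor is scaled by f_cam / f_eye.

def relay_image_size(object_size_mm, f_eye_mm, f_cam_mm):
    """Relayed image size on the sensor, thin-lens afocal approximation."""
    return object_size_mm * f_cam_mm / f_eye_mm

# Full 12 mm reticule through a 10 mm eyepiece, at the short end of the zoom:
print(relay_image_size(12.0, 10.0, 8.0))   # 9.6 mm image of the reticule
# At the long end, only the middle of the reticule can fit on a small sensor:
print(relay_image_size(12.0, 10.0, 50.0))  # 60.0 mm
```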

Cheers,
Bruce
 
Hi John,

I did what you suggested and it worked, but there is still one aspect that I am unsatisfied with.

My starting point is a USB camera with an OV9712 sensor (1280 x 800 pixels, 3x3 microns each). This has a 5-50mm focal length zoom lens (manual focus), which I set to focus at infinity. I opened the aperture and put it close to the eyepiece. Here are four resulting images.

At the shortest focal length:
one.jpg
My issue here is the following. The green circle does not extend out to the boundaries of the reticule. That boundary would be a circle with a diameter about 1.5 - 2x the diameter of the visible green circle. This is like the image I get if I look through the eyepiece, but with my eye too far away from the eyepiece.

To say it another way: what I *should* be seeing in the previous photo is a green background that covers almost the entire rectangular area, with the corners black, and a sharp circular dividing boundary between the black corners and green circular area.


If I zoom in by increasing the focal length I get this:

two.jpg

The issue is the same. If I look through the eyepiece, this entire rectangular region should be fully illuminated. Meaning: I should see a rectangular green background not a circular green background. There should be no black in the background.

If I increase the focal length more:

three.jpg

Now you can see that the green area is gradually filling the frame.

Finally, if I make the focal length close to maximum:

four.jpg

This is extremely nice to use. It's very easy to repeat 0.1 arcsec at this level of zoom. It's also nice that the entire field of view is a uniform green.

Do you have any idea how I could fix the issue that I am describing in the first photos?

Could it be that the autocollimator is actually forming two images? If I make the green light source bright, put a sheet of paper over the eyepiece, and move the paper up and down, I can see that the eyepiece brings the green light to a sharp focus at a point about 19mm above the top surface of the upper lens. If I put a CMOS array at that focal point, where the focused spot is about 1mm in diameter, I can see that this makes the AC act like a telescope, and gives a sharp view of the object at the location of the mirror (but not of the target or reticule).

If I have understood correctly, the eyepiece should be collimating the rays coming from the reticule/target, i.e. making them parallel as if those objects were at infinity. So the eyepiece is focusing one set of rays to a point, but making a different set of rays parallel. It is the latter ones that I want to image, not the former.
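For anyone following the small-angle bookkeeping, here is a tiny sketch of why these instruments resolve so finely. Note the objective focal length below is a made-up placeholder, not the TA51-2 specification:

```python
import math

# Autocollimator geometry (small-angle): a mirror tilt of theta deflects the
# returned beam by 2*theta, so the reflected crosshair image shifts by
# x = 2 * theta * f_obj in the objective's focal plane.
# f_obj is a HYPOTHETICAL placeholder value.

ARCSEC = math.pi / (180 * 3600)  # one arcsecond in radians

def image_shift_um(tilt_arcsec, f_obj_mm=500.0):
    """Focal-plane image shift in microns for a mirror tilt in arcseconds."""
    return 2 * tilt_arcsec * ARCSEC * f_obj_mm * 1000.0

print(image_shift_um(0.1))  # ~0.48 um for a 0.1 arcsec tilt at f_obj = 500 mm
```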

Could you or someone else here suggest a modification of my camera setup which would enlarge the green circle to full size, so that what I see is comparable to what I see with my eye? Is the angular field of view too small?

Cheers,
Bruce
 
Bruce, I can't take example pics for you because I don't have the gear with me. Your problem with the green circle being small in the camera image, though, is probably affected by how close the camera is to the eyepiece. I can cause my phone mount to do the same by loosening the clamp screw and moving the mount in and out on the eyepiece.
 
Hi Peter,

I agree with what you have written: if I could move the camera closer to the eyepiece, then the green circle would be larger. However, in the photos that I have taken, the camera lens is grazing the ocular lens, so I cannot move it any closer. Do you know if I can fix this by using a shorter focal length lens? If I understand correctly, this would provide less magnification but have a larger field of view. Or is there another solution?

Cheers,
Bruce
 
Bruce, Peter, and Company --

My quick thoughts are these:

1) the projection of the green circle onto a piece of paper held over the eyepiece is what those of us who used jig transits and alignment telescopes called "reticle projection". A convex lens system, such as your autocollimator eyepiece lens, can both "see" and create a "real image", and the image you are seeing on paper has been created by (projected from) the eyepiece lens.

Interestingly, the source image is partly a real object and partly a real image. The real-object portion is the physical reticle, while the real-image portion is, I would guess, a real image of the autocollimator's projected reticle, reflected by the mirror back into the autocollimator telescope. That reflected projection-reticle image is then being cast into a real image in the plane of the real reticle.

2) I'm thinking that the green circle of light is simply the "pupil" of light passing into the autocollimator through its objective lens system, and that the "hairs" of both the reference and projection reticles extend well beyond their illuminated areas.

I'm not having any quick thoughts about your autocollimator-image-to-camera-image concerns, but I'll mull the situation over and let you know if I come up with any ideas.

Having said that, your work with your autocollimator has been really impressive!
 
This morning I found that there is a simple way to gain access to the focal plane. By removing four screws from underneath, one can easily take off the entire reticule assembly. This is probably also the simple way to swap the system from vertical to horizontal viewing, rather than by loosening the threaded rings as described in the manual. (Those rings are so tight that I cannot turn them.)

IMG_8259-new.jpg

IMG_8258-new.jpg

This let me try something which I think is going to be by far the best solution: place the sensor directly on the focal plane. Here is a simple sensor, removed from its normal mounting box

IMG_8261-new.jpg


This is an OV9712 chip, which has 1280 x 800 pixels, each 3 microns square. So the total optical area is 3888 x 2430 microns. Just holding that at the prime focus by hand


IMG_8260-new.jpg

gives a very good image

concept.jpg-new.jpg

This is clear enough to easily analyze in software to derive exact position values.

This is what is called a "1/4 inch" sensor, and it's not large enough to cover the entire focal plane: the diagonal is only 4.6mm. I'm going to see if I can find a larger sensor, say "1/2 inch", which should do a better job of covering the focal plane. Then I'll make a proper mount for it. I think I like this solution better than fussing with lenses, because the papers I have cited above show that it's quite easy to get resolution better than 1/10 arcsecond with no need to turn dials while staring at the screen.
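As a rough sketch of what those 3 micron pixels buy on the focal plane (the objective focal length here is a hypothetical placeholder, not the TA51-2 spec):

```python
import math

# One pixel on the focal plane spans a mirror tilt of theta = pixel / (2 * f_obj),
# from the autocollimation relation x = 2 * theta * f_obj.
# The 500 mm objective focal length is a HYPOTHETICAL placeholder.

def arcsec_per_pixel(pixel_um, f_obj_mm):
    """Mirror tilt in arcseconds that moves the image by one pixel."""
    theta_rad = (pixel_um * 1e-3) / (2.0 * f_obj_mm)
    return theta_rad * 180 * 3600 / math.pi

print(arcsec_per_pixel(3.0, 500.0))         # ~0.62 arcsec per 3 um pixel
print(arcsec_per_pixel(3.0, 500.0) * 1280)  # ~790 arcsec across the sensor width
```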
 
Some of today's space-qualified star trackers are essentially cameras, having a convex lens system to project real images of star fields onto a photo array. Interestingly, the sensor is not at the lens system's focus, but is instead deliberately defocused.

By doing this, the images of the individual stars fall on a number of pixels. This enables analytically enhancing the precision of the individual-star centroid locations in sensor-array space, by considering not only the locations of the pixels, but also the degree of illumination.
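A toy one-dimensional version of that centroid idea, with a synthetic Gaussian spot standing in for a defocused star image (all numbers here are illustrative):

```python
import math

# Spread a spot over several pixels, then locate it by its intensity-weighted
# centroid: the recovered position is good to a small fraction of a pixel.

def sample_spot(center, n_pix=15, sigma=2.0):
    """Per-pixel intensities for a Gaussian spot centred at `center`."""
    return [math.exp(-((i - center) ** 2) / (2 * sigma ** 2)) for i in range(n_pix)]

def centroid(pixels):
    """Intensity-weighted mean pixel position."""
    return sum(i * p for i, p in enumerate(pixels)) / sum(pixels)

true_center = 7.3                        # deliberately between pixel centres
est = centroid(sample_spot(true_center))
print(est)                               # recovers ~7.3, well inside one pixel
```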
 
Similar "dithering" methods are often employed in analog-to-digital conversion (ADC). There, a small amount of white noise, whose rms amplitude is comparable to the least significant bit (LSB), is added to the signal. Provided that the signal bandwidth is much lower than that of the noise and the ADC, averaging gives sub-LSB resolution.

In the first of the papers that I cite above, the pixel size corresponds to about an arcsecond, but the authors easily get resolution about 30 times finer than that. This is because the target line sits on at least N = 1000 pixels. Using all of that information resolves the position to about sqrt(N) better than one pixel width. Projecting an image that uses half of the pixels, as in the star tracker, and using 2-d matched filtering, should reduce the error by a factor of about sqrt(N), where N ~ 10^6 is the total number of pixels.
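The ADC analogy is easy to demonstrate with a toy simulation (unit-LSB quantizer, Gaussian dither of 1 LSB rms; the numbers are illustrative only):

```python
import random

# A constant 0.3 LSB signal is invisible to a bare unit-LSB quantizer, but
# averaging many samples with 1 LSB rms Gaussian dither added recovers the
# sub-LSB value.

random.seed(1)

def quantize(x):
    """Ideal quantizer with a 1 LSB step."""
    return round(x)

signal = 0.3
bare = quantize(signal)                # always 0: the signal is below 0.5 LSB

n = 100_000
dithered = sum(quantize(signal + random.gauss(0.0, 1.0)) for _ in range(n)) / n
print(bare, dithered)                  # 0 versus ~0.3
```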
 