
OLAF’s Lens Art

Published April 18, 2014

This is a Geek Article, with very little practical information. But there are pretty pictures that non-Geeks might like. (Not the construction pictures, the ones further down.)

First, I should explain why I haven’t posted much lately. Lensrentals was able to expand into some adjacent space, which was desperately needed, but it meant the testing and repair departments had to be moved and expanded into the new space. Over the last 10 days the second testing area went from this . . .

to this.

To this.

We were finally able to get some new repair workstations set up.

And we moved OLAF to his own room, where he’ll shortly be joined by our new MTF bench.

Pictures from OLAF

We did manage to squeeze in some time with OLAF to explore his capabilities a bit – although we aren’t done with that by any means. One of the first things we needed to do was set up standards for various lenses. Basically, this means testing known good and bad copies of different lenses so we can see what the reversed point spread images from OLAF should look like, and what they look like for different types of misalignment.

People like to think that adjusting a lens just means turning a knob somewhere and making all of the points nice and sharp. That is sort of true in the very center of the lens (not the turning-a-knob part, but making the central point nice and sharp).

Right at the center, every lens is capable of resolving a clear point. This image shows the center point of a lens before (top) and after (bottom) adjustment.

For a few lenses, like the Canon 50mm f/1.2 L and Nikon 70-200 f/2.8 VRII, sometimes that’s all that’s needed. OLAF is a champ with this kind of thing and lets us do these adjustments in minutes rather than hours.

Unfortunately, centering one element is not the most common type of lens adjustment. Other adjustments require evaluating things away from the center (properly called “off-axis”). Once you’re away from the center, five-micron points don’t look much like points, even with very good lenses that are well adjusted optically. In theory, a lens designer can calculate what the various aberrations of a given design should make a point look like off-axis.

The patterns above are taken from “Generic Misalignment Aberration Patterns and the Subspace of Benign Misalignment” by Paul L. Schechter and Rebecca Sobel Levinson. Catchy title, isn’t it?

In reality, you don’t get one type of aberration at an off-axis point. You get a combination of several. Sometimes the lens designer uses one aberration to counteract another. Sometimes there will be two or three small aberrations instead of one large aberration, etc. Plus, when a lens has a decentered, tilted, or misspaced element, things get really interesting. The end result is that off-axis points on a real lens usually don’t look like any of the above patterns, especially when you look at them in color and can also see chromatic aberrations.
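If you want a feel for how those combinations stack up, the standard way to compute an off-axis point image is from the lens’s pupil function: put some wavefront error on a circular pupil and take the Fourier transform. Here’s a toy numpy sketch of that calculation; it has nothing to do with how OLAF works, and the coma and astigmatism coefficients are invented. Change the coefficients and the spot changes shape.

```python
import numpy as np

# Circular pupil on a grid normalized to the pupil radius.
N = 256
x = np.linspace(-1, 1, N)
X, Y = np.meshgrid(x, x)
R = np.sqrt(X**2 + Y**2)
theta = np.arctan2(Y, X)
pupil = (R <= 1.0).astype(float)

# Wavefront error in waves: a little coma plus a little astigmatism.
# The 0.3 and 0.2 coefficients are made up purely for illustration.
coma = 0.3 * (3 * R**3 - 2 * R) * np.cos(theta)
astigmatism = 0.2 * R**2 * np.cos(2 * theta)
wavefront = (coma + astigmatism) * pupil

# The point spread function is the squared magnitude of the Fourier
# transform of the pupil function (zero-padded for a finer PSF grid).
field = pupil * np.exp(2j * np.pi * wavefront)
psf = np.abs(np.fft.fftshift(np.fft.fft2(field, s=(4 * N, 4 * N)))) ** 2
psf /= psf.max()   # view the central region with, e.g., matplotlib's imshow
```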

We just take the brute force approach, checking half-a-dozen known good copies of each lens to see what the off-axis patterns should look like, then comparing those to the misadjusted lenses. While our database is far from complete, I find it fascinating to see how different lenses ‘interpret’ a point of light off-axis. It’s rather pretty, actually, so I thought I’d share a few images.

What points near the edge of the frame look like for various lenses. Where the points are fewer and larger, the lens is a wider angle (16mm to 24mm in this picture); the smaller, more closely spaced points are from longer lenses (70mm to 100mm).

 

The ‘points’ above are from along the lateral edge of the field, all from good-quality, well-adjusted lenses. Every one of those lenses resolves a nice round point in the center, just like the ‘adjusted’ lens in the image above; this is what that same point looks like out at the lateral edge. (These were all focused for the edge, so field curvature isn’t playing a part.)

Before you run screaming into the hills, remember these are magnifications of five-micron points of light. Plus, the way OLAF is built enlarges the points of wide-angle lenses more, so those look worse than they really are. Also remember that the pixels on your camera are probably larger than five microns, so a lot of the smearing you see at this magnification isn’t particularly meaningful.
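Roughly speaking, the point on OLAF’s sensor is magnified by the ratio of the collimator’s focal length to the focal length of the lens being tested (the collimator is a 600mm design, as comes up in the comments below). So a 16mm lens’s point gets blown up about 600 / 16 ≈ 37x, while a 100mm lens’s point is only magnified about 6x, which is part of why the wide-angle points look so much messier at this scale.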

If I take the image above, convert it to black and white, and reduce the dynamic range about 25% in Photoshop, it looks more manageable. In fact, if you scroll back up, it looks a bit like some of the textbook pictures of various aberrations, especially when you consider that most lenses have more than one aberration affecting off-axis points.

 

Anyway, the picture above showed a lot of apples-to-oranges comparisons – those were very different lenses. Looking at the outer points for some similar lenses gives an interesting apples-to-apples comparison. Below are right-side points from half-a-dozen 35mm focal length lenses.

Top to bottom: Sigma 35mm f/1.4; Canon 35mm f/1.4; Zeiss 35mm f/1.4; Canon 35mm f/2 IS; Samyang 35mm f/1.4; Canon 35mm f/2 (old version)

 

Here are lateral edge points from four different 50-something mm lenses.

Clockwise from top left: Nikon 58mm f/1.4; Zeiss Otus 55mm f/1.4; Canon 50mm f/1.2 L; Zeiss 50mm f/1.4. All were shot at widest aperture, so the Canon 50mm f/1.2 is at a bit of a disadvantage.

 

I don’t think this is important data for evaluating lenses, or anything. But it is kind of interesting to see how a lens renders a point of light off-axis. I tend to roll my eyes a bit when I hear someone say, “This lens is just as sharp in the edges as it is in the center,” because no lens is. Ever. This explains why I prefer to shoot my subject in the center of my image and then crop for composition. I can tell myself those off-axis aberrations don’t really affect the image much — but I still know the round points are all in the center. Sometimes being a Geek has disadvantages.

On the other hand, looking at this you’d think no lens could ever resolve anything recognizable at the edge of the image. So just for fun, I took the 50mm images above, put them in Photoshop, converted to the luminance channel, and knocked off the top and bottom 30% of the luminance with the Levels command in about 14 seconds. Guess what the images above look like now? Yeah, they’re pretty much points. This is a simple kind of thing that any modern camera could (or does) do in firmware, or that we can easily do in postprocessing.
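If you want to try the same trick outside Photoshop, here’s a rough numpy/PIL sketch of what that Levels move amounts to: work on the luminance, throw away the bottom and top 30% of the range, and stretch what’s left back to full scale. The file names are just placeholders.

```python
import numpy as np
from PIL import Image

# Load the point image and keep only luminance (placeholder file name).
lum = np.asarray(Image.open("edge_points.png").convert("L"), dtype=np.float32) / 255.0

# Levels-style clip: everything below 30% goes to black, everything above
# 70% goes to white, and the middle is restretched to the full 0..1 range.
# The faint smear around each point disappears; the bright core remains.
low, high = 0.30, 0.70
clipped = np.clip((lum - low) / (high - low), 0.0, 1.0)

Image.fromarray((clipped * 255).astype(np.uint8)).save("edge_points_clipped.png")
```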

 

Does this have any point? Maybe a bit. For example, if you look at what a point looks like near the center of the image compared to one near the edge, you may understand why I’m such an advocate of using masks when I sharpen or otherwise post-process an image. I usually want more effect out near the edges and less near the center. Below are images for a central point and a lateral edge point for four lenses, all of which are optically good copies of what are considered excellent lenses. It doesn’t seem logical I’d want to apply the exact same manipulation to each.

This also demonstrates why in-camera image processing is becoming so popular with manufacturers. It’s pretty simple, with a lens database in place, to process the images so that they look better than they would without specific processing.

The central point (left side) and one from the lateral edge (right side) for four different lenses.
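As a concrete, if oversimplified, sketch of that masking idea, the snippet below applies an unsharp mask whose strength ramps up with distance from the image center, so the edges get most of the sharpening and the center gets almost none. The file name and the amount are arbitrary, and it assumes an RGB image; it’s meant as an illustration, not a recipe.

```python
import numpy as np
from PIL import Image, ImageFilter

img = Image.open("photo.jpg").convert("RGB")        # placeholder file name
blurred = img.filter(ImageFilter.GaussianBlur(radius=2))

base = np.asarray(img, dtype=np.float32)
blur = np.asarray(blurred, dtype=np.float32)
detail = base - blur                                # high-frequency detail layer

# Radial mask: 0 at the center, 1 at the corners, so the sharpening is
# strongest out where off-axis aberrations smear the point spread function.
h, w = base.shape[:2]
y, x = np.mgrid[0:h, 0:w]
r = np.sqrt((x - w / 2) ** 2 + (y - h / 2) ** 2)
mask = (r / r.max())[..., None]                     # broadcast over color channels

amount = 1.5                                        # arbitrary sharpening strength
sharpened = np.clip(base + amount * mask * detail, 0, 255).astype(np.uint8)
Image.fromarray(sharpened).save("photo_edge_sharpened.jpg")
```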

 

We Really Do Use OLAF

Ok, the pictures and stuff are fun, but that’s not why I bought this machine (although the pictures are a nice side benefit). I bought it to adjust decentered lenses. For adjustments, all of the chaos of different patterns isn’t particularly important.

In addition to making the center point as close to a true point as possible, we look at the two sides and compare them. If the sides don’t look about the same, the lens isn’t quite right, even when the center has been adjusted to a nice point. Below are images comparing the left and right points from three different lenses. It’s pretty obvious that in each case the sides aren’t alike, and therefore the lens is optically out of sorts.

 

For the middle and bottom images, the points on one side are good and we simply need to get the opposite side to match. The lens in the upper image is really in a bad way and neither side looks anything like it should. This is when we pull up a saved file of what the lens should look like so we know what we’re trying to get to.
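We make this comparison by looking at the images, but if you wanted to put a rough number on it, a toy version of the same check is easy to sketch: mirror one side’s point image and see how much it differs from the other side’s. Everything below (the function, the threshold) is invented for illustration, not anything OLAF actually computes.

```python
import numpy as np

def side_asymmetry(left_patch: np.ndarray, right_patch: np.ndarray) -> float:
    """Toy metric: difference between the right-side point image and the
    mirror image of the left-side point image. 0 means the sides match."""
    left = left_patch / left_patch.sum()
    right = right_patch / right_patch.sum()
    mirrored = np.fliplr(left)        # mirror left so it should overlay right
    return float(np.abs(right - mirrored).sum() / 2.0)

# Hypothetical usage with two grayscale patches cropped around each point:
# if side_asymmetry(left, right) > 0.15:     # threshold is made up
#     print("Sides don't match -- probably a decentered or tilted element")
```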

Of course, once we’ve got the two sides improved, we have to rotate the lens to several positions and make further adjustments until it’s the same in all quadrants.

It’s simple, although it’s often not easy. Each lens has different adjustable elements, and each adjustment has different effects. Some lenses still take us hours to adjust properly. But being able to do those adjustments while we watch has reduced our optical adjustment times by 50%, which is a really big deal. Even more importantly, we’ve been able to correct a dozen lenses that neither we nor the factory service center were able to adjust at all before.

It’s early in the learning process, so I can’t say with any certainty that OLAF is the universal answer to optical adjustments. But it’s certainly going to be a big help. More than that, though, it’s helping us learn more about lenses, so it’s already worth the money.

 

Roger Cicala

Lensrentals.com

April, 2014

Author: Roger Cicala

I’m Roger, and I am the founder of Lensrentals.com. Hailed as one of the optic nerds here, I enjoy shooting collimated light through 30X microscope objectives in my spare time. When I do take real pictures I like using something different: medium format, or a Pentax K1, or a Sony RX1R.

Posted in Equipment
  • NancyP

Hey, those are entertaining aberrations. It is not too hard to keep a geek happy. I have to chuckle when I see the astro-landscape photos I have made with old Planar-design fast primes – enormous bat-wing coma at f/1.2 or f/1.4 or whatever the lens’s wide-open aperture is. They are coma-ing to get us, on wings of aberration.

  • Grant

Are there competent lens adjustment services available around the world (including, ahem, Australia)? I am struck by how much better a lens is after adjustment. Wouldn’t it be worth getting our favourite lenses adjusted by a specialist?

  • Aaron

    Hehe, nice Roger. I can’t wait to hear some stories about the service centers when you start sending MTF results along with the lens and hear what you get back from them.

  • Roger Cicala

    Nqina – same as with an SLR – dust on a camera sensor.

  • Graham Stretch

    Hi Roger.
    Thanks for that, nothing worse than a workflow that involves knowingly undoing previous corrections with the next adjustment.
    I would love to be a fly on the wall of the service centre that first receives the 50% resolution difference message, and then each incremental step up the customer service / quality control chain to see the reaction this creates.
    Think it might go something along these lines?
    “Another (pick a lens name) for checking for out of centre and softness.”
    “You know the drill just send it back as within limits.”
    “But, but, this guy has proof, he has this from his MTF bench.”
    “Oh my (insert deity) now what? (sobbing noises, head in hands) we will have to send it upstairs, they’ll be furious. Oh well it’s been nice working with you.”

    Cheers Graham.

  • Nqina Dlamini

    Enjoyed the article.
    What is the little “dirt” dot on your images?

  • Steven

    Roger,

    This is all so very cool… I really do hope one day you can offer calibration/adjustments. I would LOVE to send all my lenses to your team to check, adjust and make sure they operate properly. Canon can’t seem to do things right when I send stuff to them, so I would gladly pay for your team to try (and have more confidence…)

  • Roger Cicala

Jose, it’s a 600mm mirrored collimator and is supposed to be as aberration-free as can be made. Since the center point generally shows no aberrations, the collimator is, I think, aberration-free enough for what we’re trying to do with it.

SadGuy, it will get worse for them when we get our new MTF bench. After that, when they say something is in spec, we’ll be able to send a lovely note saying something like, “Just want to be sure that I understand: your company considers a 50% resolution difference between the left and right sides to be ‘within spec’?”

  • SadGuy

    OLAF is the very essence of disruptive technology—especially if LensRentals starts testing lenses for customers.

Here’s why: if people flood Nikon, Canon, et al. with bad lenses and include objective measurements of their problems from OLAF, it could be a bit of a problem, because from what you’ve written it just doesn’t sound like any of these companies are capable of, or willing to, really get lenses dialed in.

    So we are at the beginning of a customer service apocalypse that will make the Nikon D600 debacle look like a Sunday school picnic.

    Sadly, as the instigator of this rebellion, Roger’s life won’t be worth a plugged nickel.

    So, it was nice reading your blogs. And when you get to Heaven please find out for us whether God shoots Nikon or Canon; maybe you could let us know via a message that miraculously appears on a hush puppy.

  • Jose

The light spot should be small enough to reveal the aberrations of the lens under test, but it does not need to have any particular relation to the pixel size of the camera that will be used with the lens.

It is clear to me that the lens under test works in reverse compared with normal use. The lens works as a kind of close-up lens for the collimator. The collimator takes the spot image (at infinity) and projects it onto OLAF’s sensor, with a magnification equal to the ratio between the focal lengths of the collimator and the lens under test. I wonder what the F.L. of the collimator is, and whether the collimator’s aberrations can be ignored, especially when the F.L. of the lens under test is 400, 500, 600, 800mm, or more.

  • Roger Cicala

John, on a 24-megapixel crop sensor, of course, but not on full frame. Even the D800 is nearly 5 microns (4.9 if I recall correctly). These examples were all done at the field-of-view edges for full frame; I’d have to pull back the angle of view a bit for a crop sensor. There would still be aberrations, of course, but they would not be quite as severe. Although this does raise a pertinent question: do the less severe aberrations on a crop sensor (since the angle of view is narrower) offset the smaller pixel size when it comes to resolving edge pixels? I don’t know the answer, but it’s an interesting question.

    Of course, that leads to lots of other questions about lenses designed specifically for smaller sensors versus full-frame lenses shot on smaller sensors. But not an easily answered question, I think. The answer will be different for each lens comparison. For example, what would be a ‘sombrero’ field curvature on a full-frame might simply be horrid edge curvature on a crop.
