Lenses and Optics

Notes on Lens and Camera Variation

Published October 2, 2011

A funny thing happened when I opened Lensrentals and started getting 6 or 10 copies of each lens: I found out they weren’t all the same. Not quite. And each of those copies behaved a bit differently on different cameras. I wrote a couple of articles about this: This Lens is Soft and Other Myths talked about the fact that autofocus microadjustment would eliminate a lot, but not all, of the camera-to-camera variation for a given lens. This Lens is Soft and Other Facts talked about the inevitable variation in mass producing any product, including cameras and lenses: there must be some real difference between any two copies of the same lens or camera.

A lot of experienced photographers and reviewers noted the same things, and while we all talked about it, words and descriptions alone were a poor way to demonstrate the issue.

And Then Came Imatest

We’ve always had a staff of excellent technicians that optically test every camera and lens between every rental. But optical testing has limitations: it’s done by humans and involves judgement calls. So after we moved and had sufficient room, I spent a couple of months investigating, buying, and setting up a computerized system to allow us to test more accurately. We decided the Imatest package best met our needs and I’ve spent most of the last two months setting up and calibrating our system (Thank you to the folks at Imatest and SLRGear.com for their invaluable help).

It has already proven successful for us, as it is more sensitive and reproducible than human inspection. We now find some lenses that aren’t quite right, but that were perhaps close enough to slip past optical inspection. Plus the computer doesn’t get headaches and eyestrain from looking at images for 8 to 10 hours a day.

Computerized testing has also given me an opportunity to demonstrate the amount of variation between different copies of lenses and cameras. We have dozens (in some cases dozens of dozens) of copies of each lens and camera. While we don’t perform the multiple, critically exact measurements that a lens reviewer does on a single copy, performing the basic tests we do on multiple copies demonstrates variation pretty well.

Lens-to-Lens Variation

We know from experience that if we mount multiple copies of a given lens on one camera, each one is a bit different. One lens may front focus a bit, another back focus. One may seem a bit sharper close up, another is a bit sharper at infinity. But most are perfectly acceptable (meaning the variation between different copies is a lot smaller than the variation you’re likely to detect in a print). I can tell you that, but showing you is more effective.

Here’s a good illustration: a run of 3 different 100mm lenses, all of which are known to be quite sharp: the original Canon 100mm f/2.8 Macro, the newer Canon 100mm f/2.8 IS L Macro, and the Zeiss ZE 100mm Makro. The chart shows the highest resolution (at the center of the lens) on the horizontal axis and the weighted average resolution of the entire lens on the vertical axis, measured in line pairs / image height. All were taken on the same camera body, and the best of several measurements for each lens copy is the one graphed.

Resolution of multiple copies of several 100mm lenses
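As an illustration of what the vertical-axis number represents, a weighted average across the frame might be computed roughly like this. The region weights and the readings below are hypothetical stand-ins for illustration only, not Imatest’s actual formula or data:

```python
# Sketch of a weighted-average resolution figure like the one on the
# vertical axis above. The region weights and sample readings are
# hypothetical illustrations, not actual Imatest output.

def weighted_average_resolution(readings, weights):
    """Combine per-region resolution readings (line pairs / image height)
    into a single weighted average for the whole lens."""
    total_weight = sum(weights.values())
    return sum(readings[region] * weights[region]
               for region in readings) / total_weight

# Center counts most, corners least -- a common (hypothetical) weighting.
weights = {"center": 4, "mid": 2, "corner": 1}

# Made-up readings in lp/ih for one lens copy.
readings = {"center": 820, "mid": 700, "corner": 560}

avg = weighted_average_resolution(readings, weights)
print(round(avg, 1))
```

A lens that is brilliant in the center but weak in the corners would score high on the horizontal axis of the chart but lower on the vertical one.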

It’s pretty obvious from the image there is variation among the different copies of each lens type. I chose this focal length because there was a bad lens in this group, so you can see how different a bad lens looks compared to the normal variation of good lenses. As an aside, the bad lens didn’t look nearly as bad as you would think: if I posted a small JPG taken with it, you couldn’t tell the difference between it and the others. Blown up to 50% in Photoshop, though, the difference was readily apparent.

My point, though, is that while the Canon 100mm f/2.8 IS L lens is a bit sharper than the other two on average, not every copy is. If someone were doing a careful comparative review, there’s a fair chance they could get a copy that wasn’t any sharper than the other two lenses. I think this explains why two careful reviewers may have slightly different opinions on a given lens. (Not, as I see all too often claimed on various forums, because one of them is being paid by one company or another. Every reviewer I know is meticulously honest.)

Autofocus Variation

We all know camera autofocus isn’t quite as exact as we wish. (Personally, after investigating how autofocus works for this article, I’m amazed that it’s as good as it is, but I still complain about it as much as you do.) But when I started setting up our testing, I was hoping we could use autofocus to at least screen lenses initially. The results were rather interesting. Below is the same type of graph for a set of Canon 85mm f/1.8 lenses I tested using autofocus. Notice I again included a bad copy as a control.

 

Test run of a dozen Canon 85mm f1.8 lenses (and one known soft copy) using autofocus

(For those of you who are out there thinking “I want one of those top 3 copies, not one of the other ones”, and I know some of you are, keep reading.)

Then I selected one copy that had average results (Copy 7), mounted it to the test camera, and took 12 consecutive autofocus shots with it. Between each shot I’d either manually turn the focus ring to one extreme or the other, or turn the camera off and on, but nothing else was moved. (By the way, for testing the camera is rigidly mounted to a tripod head, mirror lock up used, etc.)

In the graph below, overlaid on the original graph, the dark blue diamonds are the 12 autofocus results from one lens on one camera. Then I took 6 more shots, using live view 10x manual focus instead of autofocus, again spinning the focus ring between each shot. The MF shots are the green triangles. I should also mention that when I take multiple shots without refocusing the results are nearly identical – that would be a dozen blue diamonds all touching each other. What you’re seeing is not variation in the testing setup, it’s variation in the focus.

Copy 7, repeatedly autofocused (blue diamonds) and manually focused (green triangles)

It’s pretty obvious that the spread of sharpness of one lens focused many times is pretty similar to the spread of sharpness of all the different copies tested once each. It’s also obvious that live view manual focus was more accurate and reproducible than autofocus. Of course, that’s with 10X live view, a still target, and a nice star chart to focus on and all the time in the world to focus correctly. No surprise there, we’ve always known live view focusing was more accurate than autofocus.
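One way to put numbers on that comparison is to compute the mean and standard deviation of the sharpness readings for repeated AF trials versus repeated live-view MF trials. The readings below are hypothetical stand-ins, not the actual test data; the point is only that the tighter MF spread shows up as a smaller standard deviation:

```python
# Quantifying the spread described above: repeated-focus variation for one
# lens on one camera. All numbers here are hypothetical stand-ins (lp/ih),
# not the actual test data.
import statistics

af_trials = [690, 715, 702, 680, 710, 698, 684, 707, 695, 720, 688, 701]
mf_trials = [712, 716, 710, 715, 711, 714]  # live-view 10x manual focus

def spread(samples):
    """Return (mean, sample standard deviation) of a list of readings."""
    return statistics.mean(samples), statistics.stdev(samples)

af_mean, af_sd = spread(af_trials)
mf_mean, mf_sd = spread(mf_trials)
print(f"AF: mean {af_mean:.0f}, sd {af_sd:.1f}")
print(f"MF: mean {mf_mean:.0f}, sd {mf_sd:.1f}")
```

With numbers like these, the manual-focus standard deviation comes out several times smaller than the autofocus one, which is the pattern the graph shows visually.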

One aside on the autofocus topic: Because it would be much quicker for testing, I tried the manual versus autofocus comparison on a number of lenses. I won’t bore you with 10 more charts but what I found was that older lens designs (like the 85 f/1.8 above) and third party lenses had more autofocus variation. Newer lens designs, like the 100mm IS L had less autofocus variation (on 5DII bodies, at least – this might not apply to other bodies).

Oh, and back to the people who wanted one of the top 3 copies: when I tested two of those repeatedly, I never again got numbers quite as good as those first numbers shown on the graph. The repeated images (including manual focus) were more towards the center of the range, although they did stay in the top half of the range, at least on this camera, which provides me an exceptionally skillful segue into the next section. (My old English professor would be proud. Not of my writing skills, but simply that I used segue in a sentence.)

Camera to Camera Variation

Well, we’ve looked at different lenses on one camera body, but what happens if we use one lens and change camera bodies? I had a great chance to test that when we got a shipment of a dozen new Canon 5D Mark II cameras in. First, I tested a batch of Canon 70-200 f2.8 IS II lenses on one camera, using 3 trials of live view focusing on each. The best results for each lens are shown as green triangles.

Then I took one of those lenses (mounted to the testing bench by its tripod ring) and repeated the series on 11 of the new camera bodies. The blue diamonds and red squares this time each represent a different camera on the same lens. (4 test shots were taken with each camera, and while the best is used, each camera’s four shots were almost identical.) Obviously the same lens on a different body behaves a little differently.

A group of Canon 70-200 f2.8 IS II lenses tested on one body (green triangles) and one of those lenses tested on 11 brand new Canon 5DII bodies (red squares and blue diamonds).

I separated the cameras into two sets because we received cameras from two different serial number series on this day. I don’t know that conclusions are warranted from this small number, but I found the difference intriguing. And maybe worth some further investigation.

Summary

Notice I don’t say conclusion, because this little post isn’t intended to conclude anything. It simply serves as an illustration showing visually what we all (or at least most of us) already know:

  • Put different copies of the same lens on a single camera and each will vary a bit in resolution.
  • Put different copies of the same camera on a single lens and each will vary a bit in resolution.
  • Truly bad lenses aren’t a little softer, they are way softer.
  • Autofocus isn’t as accurate as live view focus, at least when the camera has not been autofocus microadjusted to the lens.

All of this needs to be put in perspective, however. If you go back to the first two charts, you’ll notice the bad copies are far different from the shotgun pattern formed by all the good copies. And when we looked at those two bad copies, we had to look fairly carefully (examining 50% JPGs on the monitor) to see they were bad.

The variation among “good copies” could probably be detected by some pixel peeping. For example, if you examined the images shot by the best and worst Canon 100 f2.8 IS L lenses side-by-side (the images I took on my test camera), you could probably see a bit of difference. But if I handed you the two lenses and you put them on your camera, they’d behave slightly differently there, and the results would be different.

So for those of you who spend your time worried about getting “the sharpest possible lens”, unfortunately sharpness is rather a fuzzy concept.

Roger Cicala

Lensrentals.com

October, 2011

 

Addendum:

Matt’s comment made me realize I hadn’t talked about one obvious variable in this little post: how much of the variation is caused by the fact that these are rental lenses that have been used? The answer (at least for Canon prime lenses) is not much, if any. For example, the graph below compares a set of brand new Canon 35mm f/1.4 lenses tested the day we received them (red boxes) to a set taken off of the rental shelves (blue diamonds).

Comparison of stock 35mm f/1.4 lenses with new-from-box copies

Please note I make this statement only for Canon prime lenses. Zooms are more complex and I see at least one zoom lens that doesn’t seem to be aging well, but until I get more complete numbers to confirm what I think I’m seeing I won’t say more. I see no reason to expect other brands to be different, but at this point we’ve only been able to test Canon lenses (these tests are pretty time consuming and we have a lot of lenses).

 

Author: Roger Cicala

I’m Roger and I am the founder of Lensrentals.com. Hailed as one of the optic nerds here, I enjoy shooting collimated light through 30X microscope objectives in my spare time. When I do take real pictures I like using something different: a medium format camera, or a Pentax K1, or a Sony RX1R.

  • Jack C

    Nice article… the charts summarize the information very well.

    I would be very interested to see a similarly in-depth analysis of how Contrast Detect AF compares with Phase Detect AF in terms of accuracy and consistency of results

    -JC

  • Roger Cicala

    Dave,

    Remember, with these samples I purposely included some that had failed our standard inspection process (optical test charts, etc.) so we would get some perspective on the much bigger difference between a bad lens and normal sample variation among good lenses. Bad copies, out of the box, are rare, but they happen. Maybe 1%, with some variation depending on the brand, whether it’s a newly released type of lens, whether it’s a zoom, whether it has IS, etc. We’re going to have more bad copies than that because our lenses are used heavily: a good lens that gets dropped becomes a bad lens most of the time.

    Roger
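That rough 1% figure lends itself to a quick binomial sanity check: at a 1% out-of-box rate, finding two bad copies in a random dozen would be very unlikely, which fits Roger’s note that the bad copies in the charts were deliberately included. A sketch, where the 1% rate and the 12-copy sample size are the only inputs (an illustration, not a claim about any particular batch):

```python
# Rough binomial check: with a ~1% out-of-box bad-copy rate, how likely is
# it to see bad lenses in a sample of 12 copies chosen at random?
from math import comb

def prob_at_least_k_bad(n, p, k):
    """P(at least k defective in n copies), per-copy defect rate p."""
    return 1 - sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k))

p_one = prob_at_least_k_bad(12, 0.01, 1)  # at least one bad copy
p_two = prob_at_least_k_bad(12, 0.01, 2)  # at least two bad copies
print(f"P(>=1 bad of 12): {p_one:.3f}")
print(f"P(>=2 bad of 12): {p_two:.4f}")
```

The chance of even one bad copy in a dozen is around 11%, and of two it is well under 1% – so a chart with two bad copies almost certainly had them picked on purpose.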

  • DaveD

    What is the rate of ‘bad copy’ lenses?

    I’m not overly concerned about the minor variations between the good copy lenses. But as a lone consumer, how do we decide if we have purchased one of those ‘bad copies?’ A little worrisome that you have 2 of those in this relatively small sample.

    In your experience, just how rare are the bad copies? Would a casual comparison with one’s other lenses detect one of these bad apples?

  • Ros

    I get much more consistent AF hits with my GH1’s CDAF than I did with my 40D.
    I don’t shoot action, so for me I’m never going back to PDAF, especially after all the trips to the lab for AF calibration (lenses and bodies).

    Thanx for the article, keep em going!

  • jamesm007

    Oleg

    That’s how the Pentax service center tunes the focus on bodies as well. They don’t use charts. They put the camera on a special machine and turn screws on the bottom until the PC monitor, which is loaded with a special program, tells the tech the AF is good.

    Just a few months ago, when my K20D had almost 50,000 clicks on it, the rear e-dial started to act up and I sent it to Pentax Service in the USA. The tech checks all specs on his bench before the camera comes back to me. He found the AF out of whack and adjusted it.

    Now you must understand I thought the AF on the body was perfect. I would nail BIF like they were standing still. No problems with AF fine tuning my lenses, although some, like the kit lens, needed +8 to be decent. So where I stood, my K20D’s AF was perfect.

    When I got it back and took pics with all my lenses, the first thing I noticed is the kit lens is better. So I started my very tedious procedure of setting AF fine tuning. It’s hard because, as the article stated, there is AF variance even under perfect conditions. It took almost a week for me to set the AF fine tuning on my DA55-300mm (I do work).

    With AF tuning done, the kit lens now needed 0 AF fine tuning; before, it needed +8. The DA55-300mm needed +5; before, it needed 0. Those stand out to me. The others were about the same. The IQ of all my lenses stayed the same from what I can see, except the kit (DA18-55mm WR), which got better.

    My conclusion is that first-time buyers are in for a long road of learning, that these articles should be a must-read for newbies to take to heart, and that the maintenance schedule in the owner’s manual should at least be looked at. Many don’t even know dSLRs are supposed to have maintenance.

    I hope the author here can one day speak, from his unique standpoint, on maintaining dSLRs and whether AF falls out of whack over time.

  • John Kennekam

    Many cameras now have fine-tuning as an option, but the whole process is labourious. Why not simply automate the process for the customer? It could work as follows:

    1. The camera and lens are mounted on a tripod.
    2. A focus pattern sheet is downloaded from the vendor’s web site and printed out.
    3. The camera is pointed at the pattern sheet.
    4. Now the magic happens. The camera automatically takes a series of images ranging from -20 to +20.
    5. The camera analyses which shot is the sharpest (using existing AF software) and automatically saves the setting for that lens.

    The above process is not new; it is just that the user must do steps 4 and 5 manually, which takes forever if you have a lot of lenses.
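The five steps above amount to a simple sweep-and-pick loop. A sketch of that logic, with `autofocus_shot` and `sharpness` as hypothetical camera and analysis hooks (no real camera API is implied):

```python
# Sketch of the automated microadjustment sweep described above.
# `autofocus_shot(setting)` and `sharpness(image)` are hypothetical
# camera/analysis hooks, not a real camera API.

def auto_microadjust(autofocus_shot, sharpness, lo=-20, hi=20):
    """Try every AF fine-tune setting from lo to hi, return the one
    giving the sharpest result (steps 4 and 5 of the process above)."""
    best_setting, best_score = lo, float("-inf")
    for setting in range(lo, hi + 1):
        image = autofocus_shot(setting)   # step 4: shoot at this setting
        score = sharpness(image)          # step 5: measure sharpness
        if score > best_score:
            best_setting, best_score = setting, score
    return best_setting

# Toy stand-in: pretend sharpness peaks at a +5 adjustment.
demo = auto_microadjust(lambda s: s, lambda img: -abs(img - 5))
print(demo)
```

Given the per-trial focus variance shown earlier in the article, a real implementation would presumably average several shots per setting rather than trust a single frame.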

  • Great article. Really good to see the variability of AF exposed like this. I had a 70-200 2.8L I sold earlier in the year, as I was never happy with its focusing. It went back to Canon CPS 3 times but they could never find anything wrong with it. Bought a MkII and it’s great. Go figure!

  • Steve

    Great work on a subject that has been a thorn in my side for many years. I tend to shoot wide open with very fast lenses and have been tortured by AF misses. I think it’s a great argument for digital viewfinders, which, I believe, are the best way to achieve consistent, correct focus.

  • Roger Cicala

    That’s a very good point on the graphs: I was zooming the axis a bit to emphasize the difference, but probably should have started with a zero axis graph to show how tight the group really is.

    Thanks,
    Roger

  • John Jovic

    Nice job Roger, keep it up. You’re in a special position to be able to do this kind of testing, and of course you don’t have to share your efforts with anyone, so it’s much appreciated that you do.

    JJ

  • An outstanding article. I especially appreciate the lack of sweeping generalizations, allowing your research to speak for itself. You’ve provided a great service to the pixel peepers amongst us!

  • J. Skinner

    Great article. Written better than most professional engineers could do.
    I really liked your filter tests too.

    I understand your graphs, but it takes some mental extrapolation to realize how tightly grouped the data points are. If one of your graphs went to zero-zero on the 2 axes, it would appear to the casual reader that there is much less variation between the best and worst lenses, and they would be less likely to send one back to the seller.

  • Roger Cicala

    Udi,

    It’s easy to find a good camera-lens match, but the lens that’s sharpest on camera A is not likely to be the sharpest on camera B. If I run the same batch of lenses on another camera (which we’ve done), the overall data points will be similar, but the lenses will shift around within the grouping.

    Roger

  • Roger Cicala

    George,

    Those data points are random: lenses are not microfocus adjusted to the test camera. Our goal is to get sort of a worst case scenario since we’re trying to see how they’ll work for our customers.

    Roger

  • Hey Roger,

    Just another thank you for a great article. You are really in a unique position to provide insights like this and I certainly appreciate your doing so.

    Tim

  • Thanks for this article… a great read. Maybe I missed this in the article, but were calibrations performed on the cameras before the tests? I guess my root question is: are these variations after calibration, or out-of-the-box variation?

  • I love this article! Best I have read on any photography tech subject in years. Before CAD and CAM, when cameras and lenses were designed and manufactured by craftsmen and not computers, we said the difference between a 60 dollar lens and a 3000 dollar lens was how many ended up on the trash pile instead of on store shelves. A great lens was rigorously checked before release, which is probably still done in the cine world.
    Thank you Roger

  • eric

    Geesh, more people stealing my idea to use contrast detection to calibrate phase detection! Just kidding, but I posted that idea many months ago. Canon may have simply not thought of doing so… they could even have a firmware routine that almost totally automates the process (the user would be prompted to change focal lengths).


  • A great read, thanks for being a beacon of sanity 🙂

  • Chris

    One of the most most useful, informative articles I have ever read! Thank you sooo much for this. 🙂

  • Les Burns

    Great article.
    I remember years ago, at a large camera store in Chicago, they offered special selections of new Goerz gold dot lenses (supposedly the best) for a premium; I always wondered if it was worth the money; at least I suppose it kept you from getting a dog.
    When I ordered M300 Nikkors for my Gowlandflex (a 4×5 TLR), Olden got them to match the focal lengths tightly. Judging from what you said about Canon’s focus checking, it doesn’t seem as if focal length should be a factor in lens quality variation.
    I also remember hearing Leitz had a series of cams (or maybe an adjustment) to match variation in focal length on their rangefinder lenses.

  • Udi

    Roger,

    Your tests are very interesting. You have tested a set of identical lenses against one body, and then a single lens against multiple identical bodies.

    My question is – can you actually use this to select a “good” body and a “good” lens, or is it a question of the combination of a specific body and lens?
    Asked otherwise – if you run the same group of lenses against a different camera body than the one you just did, will the top-performing lens remain the same one, or could a lens that wasn’t so good in the previous test become a much better lens because the specific body and lens cancel out each other’s errors?

  • Ditto what Bart and Roger said about using contrast-AF to auto-microadjust phase-AF … surprising/disappointing that the manufacturers haven’t provided this capability.

    P.S. Any chance Roger of showing us some full-res images that better show the real-world impact of focusing/lens variability?


  • Roger Cicala

    Bart,
    I can’t think of any good reason they don’t do that. It seems to me it would be a simple software / firmware add on. And would be amazingly useful. I’d also love to be able to flip a switch and get to choose phase detection AF when I needed speed and contrast detection for accuracy.
    Roger

  • Another fantastic article! Thanks for keeping the information flowing, and public! You do a great service to the community.

    Will this be making its way over to canonrumors?

    I’m curious as to which zoom you need “more complete numbers” to talk about. Should be a good read, I’m sure!

    Keep up the great work,

    Cheers,

  • Oleg

    Roger,

    Regarding Canon calibration software:
    Yes, the service uses a special setup, but feature-wise it is basically vertical and horizontal focus targets with a ruler to measure focus error. First, they focus on the targets, then replace them with the ruler and take a shot. After that, the software analyzes the contrast to determine the area of best focus, and adjustments are made.
    So this special equipment could be replaced by a focus target (vertical or horizontal lines printed on paper) and a ruler. It would work slower than the special equipment, but it would work. The only problem is the software.
    Canon is very serious about it. Installation is tied to a machine; it will not work on another machine, and Canon manages keys for each installation. That is why that software is not well-known. It can run only on authorized machines.

  • Bart

    Hi Roger,

    I really love your articles, and especially the position you take: no conclusions, no big statements, just proper testing and modest judgements. Anyhow, I have a question: can you think of a reason why manufacturers won’t let the CDAF (Contrast Detect AutoFocus) be used to calibrate the PDAF (Phase Detect AutoFocus)? I could imagine mounting a new lens and focusing with CDAF on a test chart at different distances, with which the PDAF is calibrated. This would combine the pro of CDAF, focusing on the image plane, with the speed of PDAF.

    Kind regards and thanks for your great articles!
    Bart

  • Roger Cicala

    Oleg,
    I know of the factory calibration software, but this is a very pertinent point. It can do a much more thorough job than we can with microadjustment. My understanding is it requires specific test targets and the setup is too expensive for smaller repair shops to afford, but I totally agree: it would be an awesome tool to have available.
