
Notes on Lens and Camera Variation

Published October 2, 2011

A funny thing happened when I opened Lensrentals and started getting 6 or 10 copies of each lens: I found out they weren’t all the same. Not quite. And each of those copies behaved a bit differently on different cameras. I wrote a couple of articles about this: This Lens is Soft and Other Myths talked about the fact that autofocus microadjustment would eliminate a lot, but not all, of the camera-to-camera variation for a given lens. This Lens is Soft and Other Facts talked about the inevitable variation in mass producing any product, including cameras and lenses: there must be some real difference between any two copies of the same lens or camera.

A lot of experienced photographers and reviewers noted the same things, and while we all talked about it, it was difficult to demonstrate the issue with words and descriptions alone.

And Then Came Imatest

We’ve always had a staff of excellent technicians who optically test every camera and lens between every rental. But optical testing has limitations: it’s done by humans and involves judgment calls. So after we moved and had sufficient room, I spent a couple of months investigating, buying, and setting up a computerized system to allow us to test more accurately. We decided the Imatest package best met our needs, and I’ve spent most of the last two months setting up and calibrating our system (thank you to the folks at Imatest and SLRGear.com for their invaluable help).

It has already proven successful for us, as it is more sensitive and reproducible than human inspection. We now find some lenses that aren’t quite right, but that were perhaps close enough to slip past optical inspection. Plus the computer doesn’t get headaches and eyestrain from looking at images for 8 to 10 hours a day.

Computerized testing has also given me an opportunity to demonstrate the amount of variation between different copies of lenses and cameras. We have dozens (in some cases dozens of dozens) of copies of each lens and camera. While we don’t perform the multiple, critically exact measurements that a lens reviewer does on a single copy, performing the basic tests we do on multiple copies demonstrates variation pretty well.

Lens-to-Lens Variation

We know from experience that if we mount multiple copies of a given lens on one camera, each one is a bit different. One lens may front focus a bit, another back focus. One may seem a bit sharper close up, another is a bit sharper at infinity. But most are perfectly acceptable (meaning the variation between different copies is a lot smaller than the variation you’re likely to detect in a print). I can tell you that, but showing you is more effective.

Here’s a good illustration, a run of 3 different 100mm lenses, all of which are known to be quite sharp: the original Canon 100mm f/2.8 Macro, the newer Canon 100mm f/2.8 IS L Macro, and the Zeiss ZE 100mm Makro. The chart shows the highest resolution (at the center of the lens) across the horizontal axis, and the weighted average resolution of the entire lens on the vertical axis, measured in line pairs / image height. All were taken on the same camera body, and the best of several measurements for each lens copy is the one graphed.

Resolution of multiple copies of several 100mm lenses

It’s pretty obvious from the image there is variation among the different copies of each lens type. I chose this focal length because there was a bad lens in this group, so you can see how different a bad lens looks compared to the normal variation of good lenses. As an aside, the bad lens didn’t look nearly as bad as you would think: if I posted a small JPG taken with it, you couldn’t tell the difference between it and the others. Blown up to 50% in Photoshop, though, the difference was readily apparent.

My point, though, is that while the Canon 100mm f/2.8 IS L lens is a bit sharper than the other two on average, not every copy is. If someone were doing a careful comparative review, there’s a fair chance they could get a copy that wasn’t any sharper than the other two lenses. I think this explains why two careful reviewers may have slightly different opinions on a given lens. (Not, as I see all too often claimed on various forums, because one of them is being paid by one company or another. Every reviewer I know is meticulously honest.)
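
For the curious, here’s a minimal sketch of the bookkeeping behind a chart like this one. It is not our actual Imatest workflow; the copy names and numbers below are invented, and I’m assuming “best trial” simply means the trial with the highest weighted average.

    # Minimal sketch: reduce several trials per lens copy to the best one and
    # scatter-plot peak (center) resolution against weighted average resolution.
    # All values are made up for illustration.
    import matplotlib.pyplot as plt

    # trials[copy] = list of (center_lp_ih, weighted_avg_lp_ih) measurements
    trials = {
        "copy 1": [(780, 620), (770, 615), (790, 630)],
        "copy 2": [(750, 600), (760, 610), (755, 605)],
        "copy 3": [(700, 540), (710, 550), (705, 545)],  # a noticeably softer copy
    }

    # keep the best trial per copy (assumed: highest weighted average)
    best = {copy: max(runs, key=lambda m: m[1]) for copy, runs in trials.items()}

    xs = [m[0] for m in best.values()]  # peak (center) resolution, LP/IH
    ys = [m[1] for m in best.values()]  # weighted average resolution, LP/IH

    plt.scatter(xs, ys)
    for copy, (x, y) in best.items():
        plt.annotate(copy, (x, y))
    plt.xlabel("Center resolution (line pairs / image height)")
    plt.ylabel("Weighted average resolution (line pairs / image height)")
    plt.show()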

Autofocus Variation

We all know camera autofocus isn’t quite as exact as we wish. (Personally, after investigating how autofocus works for this article, I’m amazed that it’s as good as it is, but I still complain about it as much as you do.) But when I started setting up our testing, I was hoping we could use autofocus to at least screen lenses initially. The results were rather interesting. Below is the same type of graph for a set of Canon 85mm f/1.8 lenses I tested using autofocus. Notice I again included a bad copy as a control.

 

Test run of a dozen Canon 85mm f1.8 lenses (and one known soft copy) using autofocus

(For those of you who are out there thinking “I want one of those top 3 copies, not one of the other ones”, and I know some of you are, keep reading.)

Then I selected one copy that had average results (Copy 7), mounted it to the test camera, and took 12 consecutive autofocus shots with it. Between each shot I’d either manually turn the focus ring to one extreme or the other, or turn the camera off and on, but nothing else was moved. (By the way, for testing the camera is rigidly mounted to a tripod head, mirror lock up used, etc.)

In the graph below, overlaid on the original graph, the dark blue diamond shapes are the 12 autofocus results from one lens on one camera. Then I took 6 more shots, using live view 10x manual focus instead of autofocus, again spinning the focus dial between each shot. The MF shots are the green triangles. I should also mention that when I take multiple shots without refocusing the results are nearly identical – that would be a dozen blue diamonds all touching each other. What you’re seeing is not a variation in the testing setup, it’s variation in the focus.

Copy 7, repeatedly autofocused (blue diamonds) and manually focused (green triangles)

It’s pretty obvious that the spread of sharpness of one lens focused many times is pretty similar to the spread of sharpness of all the different copies tested once each. It’s also obvious that live view manual focus was more accurate and reproducible than autofocus. Of course, that’s with 10X live view, a still target, a nice star chart to focus on, and all the time in the world to focus correctly. No surprise there; we’ve always known live view focusing was more accurate than autofocus.
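
If you want to put numbers on “pretty similar spread,” a mean, standard deviation, and range for each set of trials will do it. Here’s a quick sketch; the values are invented stand-ins for the measurements graphed above, so only the shape of the comparison matters.

    # Quick sketch: compare the spread of copy-to-copy results against the
    # spread of repeated focusing on a single copy. Values are invented.
    from statistics import mean, stdev

    across_copies = [598, 612, 575, 620, 605, 590, 615, 580, 600, 610, 595, 585]  # 12 copies, AF once each
    one_copy_af   = [600, 585, 612, 570, 605, 595, 618, 580, 602, 590, 608, 575]  # one copy, AF 12 times
    one_copy_mf   = [612, 615, 610, 614, 611, 613]                                # one copy, live view MF 6 times

    for label, data in [("different copies, AF once each", across_copies),
                        ("one copy, autofocused 12 times", one_copy_af),
                        ("one copy, live view MF 6 times", one_copy_mf)]:
        print(f"{label:32s} mean={mean(data):6.1f}  sd={stdev(data):5.1f}  range={max(data) - min(data)}")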

One aside on the autofocus topic: because it would be much quicker for testing, I tried the manual versus autofocus comparison on a number of lenses. I won’t bore you with 10 more charts, but what I found was that older lens designs (like the 85 f/1.8 above) and third-party lenses had more autofocus variation. Newer lens designs, like the 100mm IS L, had less autofocus variation (on 5DII bodies, at least – this might not apply to other bodies).

Oh, and back to the people who wanted one of the top 3 copies: when I tested two of those repeatedly, I never again got numbers quite as good as those first numbers shown on the graph. The repeated images (including manual focus) were more towards the center of the range, although they did stay in the top half of the range, at least on this camera, which provides me an exceptionally skillful segue into the next section. (My old English professor would be proud. Not of my writing skills, but simply that I used segue in a sentence.)

Camera to Camera Variation

Well, we’ve looked at different lenses on one camera body, but what happens if we use one lens and change camera bodies? I had a great chance to test that when we got a shipment of a dozen new Canon 5D Mark II cameras in. First, I tested a batch of Canon 70-200 f2.8 IS II lenses on one camera, using 3 trials of live view focusing on each. The best results for each lens are shown as green triangles.

Then I took one of those lenses (mounted to the testing bench by its tripod ring) and repeated the series on 11 of the new camera bodies. The blue diamonds and red squares this time each represent a different camera on the same lens. (4 test shots were taken with each camera, and while the best is used, each camera’s four shots were almost identical.) Obviously the same lens on a different body behaves a little differently.

A group of Canon 70-200 f2.8 IS II lenses tested on one body (green triangles) and one of those lenses tested on 11 brand new Canon 5DII bodies (red squares and blue diamonds).

I separated the cameras into two sets because we received cameras from two different serial number series on this day. I don’t know that conclusions are warranted from this small number, but I found the difference intriguing. And maybe worth some further investigation.
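
For anyone who wants to try the same grouping, here’s a small sketch of how the per-body results can be bucketed by serial number series and summarized. The serial prefixes and values are made up, and with only a handful of bodies per series the comparison is a curiosity, not a conclusion.

    # Small sketch: group per-body results (one lens, many bodies) by serial
    # number series and summarize each group. Values and serials are invented.
    from statistics import mean, stdev

    results = {
        "A-01": 605, "A-02": 598, "A-03": 612, "A-04": 601, "A-05": 608,
        "B-01": 585, "B-02": 590, "B-03": 578, "B-04": 588, "B-05": 582, "B-06": 591,
    }

    by_series = {}
    for body, lp_ih in results.items():
        by_series.setdefault(body.split("-")[0], []).append(lp_ih)

    for series, values in sorted(by_series.items()):
        print(f"series {series}: n={len(values)}  mean={mean(values):.1f}  sd={stdev(values):.1f}")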

Summary

Notice I don’t say conclusion, because this little post isn’t intended to conclude anything. It simply serves as an illustration showing visually what we all (or at least most of us) already know:

  • Put different copies of the same lens on a single camera and each will vary a bit in resolution.
  • Put different copies of the same camera on a single lens and each will vary a bit in resolution.
  • Truly bad lenses aren’t a little softer, they are way softer.
  • Autofocus isn’t as accurate as live view focus, at least when the camera has not been autofocus microadjusted to the lens.

All of this needs to be put in perspective, however. If you go back to the first two charts, you’ll notice the bad copies are far different from the shotgun pattern shown by all the good copies. And when we looked at those two bad copies, we had to look fairly carefully (looking at 50% JPGs on the monitor) to see they were bad.

The variation among “good copies” could probably be detected by some pixel peeping. For example, if you examined the images shot by the best and worst Canon 100 f2.8 IS L lenses, you could probably see a bit of difference if you looked at them side-by-side (the images I took on my test camera). But if I handed you the two lenses and you put them on your camera, they’d behave slightly differently than they did on mine, and the results would be different.

So for those of you who spend your time worried about getting “the sharpest possible lens”, unfortunately sharpness is rather a fuzzy concept.

Roger Cicala

Lensrentals.com

October, 2011

 

Addendum:

Matt’s comment made me realize I hadn’t talked about one obvious variable in this little post: how much of the variation is caused by the fact that these are rental lenses that have been used? The answer (at least for Canon prime lenses) is not much, if at all. For example the graph below compares a set of brand new Canon 35mm f/1.4 lenses tested the day we received them (red boxes) to a set taken off of the rental shelves (blue diamonds).

Comparison of stock 35mm f/1.4 lenses with new-from-box copies

Please note I make this statement only for Canon prime lenses. Zooms are more complex, and I see at least one zoom lens that doesn’t seem to be aging well, but until I get more complete numbers to confirm what I think I’m seeing I won’t say more. I see no reason to expect other brands to be different, but at this point we’ve only been able to test Canon lenses (these tests are pretty time-consuming and we have a lot of lenses).

 

Author: Roger Cicala

I’m Roger and I am the founder of Lensrentals.com. Hailed as one of the optic nerds here, I enjoy shooting collimated light through 30X microscope objectives in my spare time. When I do take real pictures I like using something different: a medium format camera, a Pentax K1, or a Sony RX1R.

Posted in Lenses and Optics
  • Roger Cicala

    David,

    I totally think it is, or perhaps the idea of phase detection using the actual image sensor. But anything that eliminates focusing using one sensor and shooting the image with another has got to reduce variability.

    Roger

  • Nice work, I know it’s got to be excruciatingly tiring and for that, we thank you! I’m reminded of when I used to target shoot. Every gun (camera) has a preference for a style and weight of bullet (lens) and a particular powder/charge weight. The trick for a competition shooter (pro photog) is to fine tune his tool of choice to perform at its optimum so that the shooter himself can perform at his. It truly is a matter of mass produced product and variation, but with today’s cameras being capable of micro adjustments, it pays to spend a little time fine tuning your equipment…not only to make sure they are working together but so that you know the limitations of your product and are thereby able to get the most out of it.

    That said, I personally do not like returning anything unless it is absolutely necessary. So I won’t be one of those in the perpetual search for the optimum lens for my camera body. This kind of report helps me to understand the necessity of dialing in the correct micro-adjustments in each body for each lens and then using what you have to create the best magic possible.

    Thanks for a job well done. I sincerely hope this helps many more people understand their equipment and get busy shooting not swapping.

  • Dave Sucsy

    Much thanks, Roger. Your blogs & posts and thinking are great and very appreciated. And I will continue to show my thanks by renting from you whenever I have the need.

  • Thanks for another interesting article. I guess I knew that lenses and bodies acted the way you describe; I’ve managed an aerospace laboratory, and results always vary.

    Still, it’s very good to see some data that confirms my gut feeling about lenses and bodies. I have seen the variation with one particular wide aperture Canon zoom, of which I had five copies before giving up on finding one that I thought was to specification. I find that most of my lenses are very good once the AF is tweaked, but I also feel that the focus accuracy sometimes varies with distance. Something else you might look at one day.

  • David Stock

    Great article; thanks.

    One thing I’ve been wondering lately is whether contrast-detect autofocus, once seen as a cheap but second-rate option, is actually the future.

    As higher-end cameras start using it (either as their only autofocus system or for live view), and as advances are made in contrast-detect speed, it seems worth investigating how well this system compensates for not only lens sample-to-sample variation but also camera calibration issues and focus shift.

    Would there be as much difference among your samples if they were focussed using contrast-detect autofocus? Can contrast-detect autofocus compensate for the relatively loose tolerances of the current manufacturing processes? What are the prospects for really fast contrast detect systems in the future?

    I’ve noticed that contrast-detect autofocus, when available, often makes better use of my lenses optically than the phase-detect version, even on the same camera. In fact, I sometimes use contrast detect autofocus to calibrate my lenses for phase-detect focussing.

  • Lonny Smart

    Thanks, very interesting. Now I wonder what would happen if you threw microadjustment into the mix.

  • Mel Gross

    Roger, you mentioned the micro adjust. I know this is time consuming from my own experience. But, what might we see as far as sharpness values once that is done, as well as other values? Is it possible that these numbers might change with worse lenses becoming better than better lenses without it? I can’t get multiple lenses to try that.

  • Ruy Penalva

    Roger,

    I think these results are to be expected, chiefly in lenses, but in cameras also. In industry, as in nature, nothing repeats identically, except maybe DNA. Probably quality control at camera and lens makers does not include photographic tests, only pure electronics tests. Congratulations on showing what I already felt.

  • Michael Kasper

    Fantastic article. Thanks Roger.

    Do you guys do auto-focus calibration for people’s personal gear? That is, if I send you my camera & lenses, will you calibrate them and let me know if anything is amiss? If not, this could be a nice bolt-on service for Lensrentals!

    BTW, the 8-15 Fisheye I rented from you guys a month ago was great. I would have bought it from you at the end of the rental period. Rent to own is another idea… Credit a portion of the rental costs? Perhaps only on certain lenses that aren’t hard to get.

  • Clay Taylor

    Roger –

    Informative and amusing, as always. Great job.

    Given that I am a photodinosaur who normally manually “tweaks” (via the groundglass focusing screen) whatever AF setting my camera makes (I would have to resort to reading glasses to use the LCD screen’s 10x MF assist), I wonder about the accuracy of THAT system.

    What would the graph spread be for a lens that was Manually Focused using only the groundglass screen, compared to the AF and 10x MF Assist clusters for that same lens?

  • Roger,

    Thanks for sharing your test results. Though I have no similar empirical evidence, I remain convinced that a certain lens on a specified body can perform differently on different days. I photograph fast-moving sports (rugby, athletics) using a Canon 70-200 f2.8 IS II, and some days I find myself having to sharpen the jpegs while on other days the raw files come out razor sharp.

    It might be the weather, or how much coffee I had that morning, but the results on one day are consistent for that day but not necessarily consistent with the shots from another day. Or it could just be my imagination ;-)

    The lens is micro adjusted monthly.

  • Kjetil Johannessen

    Hi
    Good work, and interesting. The only part of this I would like to have seen a little more on is how good the measurement itself is. I would propose using the same camera and lens, but setting up the same combination at least 8 times (as if each were a new combination). To be sure, disconnect the lens from the camera each time, and, most importantly, align the camera with the test target as you would for a new camera each time. This would give a baseline for the measurement uncertainty itself.

  • Merrick

    So can we conclude that, because there’s variation from live view focusing, we could expect focus variation on cameras that use the image sensor for focusing, such as Sony NEX or micro 4/3 cameras?

  • Grant Zabro

    As informative as these plots appear to be, in reality they’re mostly useless, as error bars are not shown on any of the data points, and we have no idea how significant the differences between the various measurements are. Can you really distinguish between a resolution of 750 and 751? 750 and 760? When you average across the lens, are you accounting for both systematic and statistical errors? How reproducible is the setup? I’m sure that there are real differences between lenses and camera bodies, but without a proper error metric, it’s impossible for us to tell how significant these differences really are.

  • Craig Luna

    Roger,
    Since you have a unique perspective given the quantity of returns and service you probably deal with, could you please answer a question?

    When you send a lens in for service, especially Nikon, what is the typical recourse for out-of-spec focus? As far as you can tell, do they replace and tighten up the gears to reduce backlash, adjust the optics to zero, or does the lens get reprogrammed with a re-calibration?

    Thanks, and I hope you have been fortunate enough to learn which approach they pursue!

  • John Geisendorfer

    Excellent article & information. We see so much user information on the web that is not backed by good data, or that we fear is influenced by those selling something. So now I know there really are bad copies, but they are very likely really bad. I also know that most are fine & that my trying to nitpick the best is questionable. Thanks & I look forward to more.

  • Dan Tong

    This is the most intelligent and valuable article about photographic equipment to be posted anywhere. It’s the kind of work dpreview should have done years ago. Congratulations on doing this.

    Dan

  • Gary

    Perhaps one other line or arc on the graph–that point of resolution at which it’s not possible to detect a difference with the unaided eye. That is, are all the resolution differences on the above graphs occurring substantially above what the eye can see anyway? I’m reminded of extraordinarily good sound gear–should you pay for headphones that produce a sound range most humans can’t hear?

  • Chris

    Thanks so much for these articles, Roger. Your tests are a huge asset to the community. The vast majority of reviews are of a single sample, and as you’ve proven with this and other articles, that’s not enough data points to speak conclusively about a lens model.

    In your article you mentioned how newer lens designs seem to be more accurate, “on 5DII bodies, at least”. Anecdotally, I had a teeth-gnashing experience related to this.

    I had a 5D1 and a Tamron 28-75/2.8 that worked beautifully together. I shot portraits with it, and I’d get perfect AF on the eyes >90% of the time.

    Later I got the wildlife bug and bought a 40D. Its AF was very good with the 100-400 and 300/2.8IS I used for wildlife, but with the Tamron 28-75 I had very frequent AF misses. Headshots at 75mm, f/2.8 often missed badly enough that the eyes were OOF, either front or back focused randomly. I’d put my keeper rate down around 50%. Not usable at all.

    Later still I bought a 7D and experienced the same problem. Since I had sold my 5D1, this time I decided to use AF microadjust. I bought a LensAlign and set to calibrating all my lenses. The 300/2.8IS was off consistently by a setting of +1 or +2. The 100-400 was off by +2. The 100/2.8 USM was spot on. The Tamron was off randomly by so much that the range of the adjustment wouldn’t be enough to correct it, if the error was even consistent. Which it wasn’t… I tested by defocusing near and far then letting the camera AF. There was no rhyme nor reason to the results that I could gather.

    My current body is a 5D2, and I was saddened to see that it wasn’t just my crop bodies that were problematic. The Tamron has the same problem on the 5D2 as well. A lens that was perfect almost every time on the 5D1 is inconsistent on any body newer than a 5D1. (For what it’s worth, my old 350D never had a problem, either.) I have since bought a 24-70/2.8 (a new one) and my focus inconsistencies are gone. Unfortunately that means I now need to carry around a heavier, more expensive, and less sharp lens.

    My hypothesis from this is that sometime after the 5D Canon changed their AF algorithms enough that some older or third-party lenses don’t focus consistently. Long story short, I think there’s something to your thought that the age of the equipment might have something to do with accuracy. Not because the equipment is old, but because newer equipment has newer algorithms that work differently.

    Have any 5D1’s or older lying around to test, Roger? 🙂

  • Sylvain

    Thanks Roger, very instructive article, as always.

    Wish there was a European LensRentals 🙂

  • Allan Sheppard

    Hi Roger,
    As a long-time reader of your excellent articles, thanks for another common-sense note.

    Canon has stated (I don’t have a link) that their specification for AF performance is that the AF is inside the depth of focus for that F stop/distance.

    Could you run an experiment with one of your (say) 100L lenses and show the results when you focus at the limits of the calculated DOF range (distance plus 1/2 DOF or distance minus 1/2 DOF) and see how the numbers change? I realise this brings in CoC, etc., but it would help match Canon’s specifications to performance.
    Cheers,
    Allan

  • Roger Cicala

    Hi Matt,

    Great minds think alike — testing for changes over time is one of our major goals. The most accurate way will be as we follow lenses over time in service. The less accurate way, which we’re already doing, is comparing older and newer copies (of course this will also show a difference if an unannounced improvement has occurred in newer lenses).

  • Great article.
    I had heard a little about how different lenses produced different results, especially when used on different bodies. But I hadn’t heard much about autofocus variation.

    For those interested, the Luminous Landscape website recently posted an article about back focus: http://www.luminous-landscape.com/essays/are_your_pictures_out_of_focus.shtml

    Also, I’d love to see the test results of the same lenses over time. It would be an interesting test to find out which lenses are better made and retain sharpness after heavy use.

  • and that is why this blog is freaking awesome

  • Tenisd

    Pretty cool post. Thank You 🙂 Saw on twitter.

  • Chester

    Thanks for this post Roger. This makes me want to do the long overdue auto-focus calibration on my camera.

  • Roger, thanks for the time you spent on this testing, analysis, and, most importantly, the explanation. This type of info is extremely valuable to those of us new to the craft. BTW, I bought a used 100-400 from you after renting one from you 2 years ago. It’s still on my camera and my most often used lens; it’s also worth more now (used) than when I purchased it. Thanks, and I love your service.

    Bob

  • Roger Cicala

    SLR Gear is totally correct: focus bracketing is the most accurate way to do it – it’s the only way to eliminate the camera-lens variability I’m writing about. In our situation, though, we need to examine and evaluate that same variability because it’s going to affect our customers’ use of the lens.

    If a lens is optically superb but backfocuses horribly we need to know that so we can get it corrected, because our customer is almost certainly going to use the lens in autofocus mode. But for testing and review purposes, SLR Gear needs to determine the best possible performance of each lens, and focus bracketing is the only way to do that. I believe, in fact, that they developed that technique – they are meticulous and their reviews reflect that.

    Roger

  • neuroanatomist

    Thanks for this – like your other blog posts, it really adds to the knowledge base!

    One comment about the AF vs. MF live view behavior, and that’s based on the testing procedure at SLR gear. They point out that for careful testing, neither AF nor MF with Live View is accurate enough, and instead they take a series of focus-bracketed shots (done by moving the camera on a rail), and use the sharpest one for the test.
