Lenses and Optics

Notes on Lens and Camera Variation

Published October 2, 2011

A funny thing happened when I opened Lensrentals and started getting 6 or 10 copies of each lens: I found out they weren’t all the same. Not quite. And each of those copies behaved a bit differently on different cameras. I wrote a couple of articles about this: This Lens is Soft and Other Myths talked about the fact that autofocus microadjustment would eliminate a lot, but not all, of the camera-to-camera variation for a given lens. This Lens is Soft and Other Facts talked about the inevitable variation in mass-producing any product, including cameras and lenses: there must be some real difference between any two copies of the same lens or camera.

A lot of experienced photographers and reviewers noticed the same things, and while we all talked about it, it was difficult to demonstrate the issue with words and descriptions alone.

And Then Came Imatest

We’ve always had a staff of excellent technicians that optically test every camera and lens between every rental. But optical testing has limitations: it’s done by humans and involves judgement calls. So after we moved and had sufficient room, I spent a couple of months investigating, buying, and setting up a computerized system to allow us to test more accurately. We decided the Imatest package best met our needs and I’ve spent most of the last two months setting up and calibrating our system (Thank you to the folks at Imatest and SLRGear.com for their invaluable help).

It has already proven successful for us, as it is more sensitive and reproducible than human inspection. We now find some lenses that aren’t quite right, but that were perhaps close enough to slip past optical inspection. Plus the computer doesn’t get headaches and eyestrain from looking at images for 8 to 10 hours a day.

Computerized testing has also given me an opportunity to demonstrate the amount of variation between different copies of lenses and cameras. We have dozens (in some cases dozens of dozens) of copies of each lens and camera. While we don’t perform the multiple, critically exact measurements that a lens reviewer does on a single copy, performing the basic tests we do on multiple copies demonstrates variation pretty well.

Lens-to-Lens Variation

We know from experience that if we mount multiple copies of a given lens on one camera, each one is a bit different. One lens may front focus a bit, another back focus. One may seem a bit sharper close up, another is a bit sharper at infinity. But most are perfectly acceptable (meaning the variation between different copies is a lot smaller than the variation you’re likely to detect in a print). I can tell you that, but showing you is more effective.

Here’s a good illustration: a run of 3 different 100mm lenses, all of which are known to be quite sharp: the original Canon 100mm f/2.8 Macro, the newer Canon 100mm f/2.8 IS L Macro, and the Zeiss ZE 100mm Makro. The chart shows the highest resolution (at the center of the lens) on the horizontal axis, and the weighted average resolution of the entire lens on the vertical axis, measured in line pairs / image height. All were taken on the same camera body, and the best of several measurements for each lens copy is the one graphed.

Resolution of multiple copies of several 100mm lenses

It’s pretty obvious from the image that there is variation among the different copies of each lens type. I chose this focal length because there was a bad lens in this group, so you can see how different a bad lens looks compared to the normal variation of good lenses. As an aside, the bad lens didn’t look nearly as bad as you would think: if I posted a small JPG taken with it, you couldn’t tell the difference between it and the others. Blown up to 50% in Photoshop, though, the difference was readily apparent.

My point, though, is that while the Canon 100mm f/2.8 IS L lens is a bit sharper than the other two on average, not every copy is. If someone were doing a careful comparative review, there’s a fair chance they could get a copy that wasn’t any sharper than the other two lenses. I think this explains why two careful reviewers may have slightly different opinions on a given lens. (Not, as I see all too often claimed on various forums, because one of them is being paid by one company or another. Every reviewer I know is meticulously honest.)
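As a rough sketch of the “best of several measurements” selection described above – with invented numbers, since the actual Imatest readings aren’t reproduced here – picking the graphed point for one copy might look like this (choosing by weighted average is my assumption about the selection criterion):

```python
def best_trial(trials):
    """Pick the trial with the highest weighted-average resolution.

    Each trial is a (center_lp_ih, weighted_avg_lp_ih) tuple in
    line pairs / image height; the specific values are made up.
    """
    return max(trials, key=lambda t: t[1])

# Three hypothetical trials for one lens copy:
copy_trials = [(820, 640), (845, 655), (810, 630)]
print(best_trial(copy_trials))  # -> (845, 655)
```

Each copy then contributes one such (center, weighted average) point to the scatter plot.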

Autofocus Variation

We all know camera autofocus isn’t quite as exact as we wish. (Personally, after investigating how autofocus works for this article, I’m amazed that it’s as good as it is, but I still complain about it as much as you do.) But when I started setting up our testing, I was hoping we could use autofocus to at least screen lenses initially. The results were rather interesting. Below is the same type of graph for a set of Canon 85mm f/1.8 lenses I tested using autofocus. Notice I again included a bad copy as a control.


Test run of a dozen Canon 85mm f/1.8 lenses (and one known soft copy) using autofocus

(For those of you who are out there thinking “I want one of those top 3 copies, not one of the other ones”, and I know some of you are, keep reading.)

Then I selected one copy that had average results (Copy 7), mounted it to the test camera, and took 12 consecutive autofocus shots with it. Between each shot I’d either manually turn the focus ring to one extreme or the other, or turn the camera off and on, but nothing else was moved. (By the way, for testing the camera is rigidly mounted to a tripod head, mirror lock up used, etc.)

In the graph below, overlaid on the original graph, the dark blue diamonds are the 12 autofocus results from one lens on one camera. Then I took 6 more shots, using live view 10x manual focus instead of autofocus, again spinning the focus ring between each shot. The MF shots are the green triangles. I should also mention that when I take multiple shots without refocusing, the results are nearly identical – that would be a dozen blue diamonds all touching each other. What you’re seeing is not variation in the testing setup, it’s variation in the focus.

Copy 7, repeatedly autofocused (blue diamonds) and manually focused (green triangles)

It’s pretty obvious that the spread of sharpness of one lens focused many times is pretty similar to the spread of sharpness of all the different copies tested once each. It’s also obvious that live view manual focus was more accurate and reproducible than autofocus. Of course, that’s with 10x live view, a still target, a nice star chart to focus on, and all the time in the world to focus correctly. No surprise there; we’ve always known live view focusing was more accurate than autofocus.
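The comparison being made here – the spread of one copy focused many times versus the spread of many copies measured once each – can be sketched with Python’s statistics module. All the resolution numbers below are invented for illustration; the point is only that the two spreads come out similar in magnitude:

```python
from statistics import mean, stdev

# Weighted-average resolution (LP/IH) for 12 autofocus trials of one copy...
af_trials = [612, 598, 640, 585, 620, 605, 633, 590, 615, 602, 628, 595]
# ...and one measurement each of 12 different copies:
copies = [618, 600, 645, 580, 625, 608, 630, 588, 612, 604, 635, 598]

print(f"AF trials of one copy: mean={mean(af_trials):.0f}, sd={stdev(af_trials):.1f}")
print(f"Twelve copies:         mean={mean(copies):.0f}, sd={stdev(copies):.1f}")
```

If the two standard deviations are comparable, focus variation alone can account for much of what looks like copy-to-copy variation.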

One aside on the autofocus topic: because it would be much quicker for testing, I tried the manual versus autofocus comparison on a number of lenses. I won’t bore you with 10 more charts, but what I found was that older lens designs (like the 85mm f/1.8 above) and third-party lenses had more autofocus variation. Newer lens designs, like the 100mm IS L, had less autofocus variation (on 5DII bodies, at least – this might not apply to other bodies).

Oh, and back to the people who wanted one of the top 3 copies: when I tested two of those repeatedly, I never again got numbers quite as good as those first numbers shown on the graph. The repeated images (including manual focus) were more towards the center of the range, although they did stay in the top half of the range, at least on this camera, which provides me an exceptionally skillful segue into the next section. (My old English professor would be proud. Not of my writing skills, but simply that I used segue in a sentence.)

Camera-to-Camera Variation

Well, we’ve looked at different lenses on one camera body, but what happens if we use one lens and change camera bodies? I had a great chance to test that when we got a shipment of a dozen new Canon 5D Mark II cameras. First, I tested a batch of Canon 70-200mm f/2.8 IS II lenses on one camera, using 3 trials of live view focusing on each. The best results for each lens are shown as green triangles.

Then I took one of those lenses (mounted to the testing bench by its tripod ring) and repeated the series on 11 of the new camera bodies. The blue diamonds and red boxes this time each represent a different camera on the same lens. (4 test shots were taken with each camera, and while the best is used, each camera’s four shots were almost identical.) Obviously the same lens on a different body behaves a little differently.

A group of Canon 70-200mm f/2.8 IS II lenses tested on one body (green triangles) and one of those lenses tested on 11 brand new Canon 5DII bodies (red squares and blue diamonds).

I separated the cameras into two sets because we received cameras from two different serial number series on this day. I don’t know that conclusions are warranted from this small number, but I found the difference intriguing. And maybe worth some further investigation.


Notice I don’t say conclusion, because this little post isn’t intended to conclude anything. It simply serves as an illustration showing visually what we all (or at least most of us) already know:

  • Put different copies of the same lens on a single camera and each will vary a bit in resolution.
  • Put different copies of the same camera on a single lens and each will vary a bit in resolution.
  • Truly bad lenses aren’t a little softer, they are way softer.
  • Autofocus isn’t as accurate as live view focus, at least when the camera has not been autofocus microadjusted to the lens.

All of this needs to be put in perspective, however. If you go back to the first two charts, you’ll notice the bad copies are far different from the shotgun pattern shown by all the good copies. And when we looked at those two bad copies, we had to look fairly carefully (at 50% JPGs on the monitor) to see they were bad.

The variation among “good copies” could probably be detected by some pixel peeping. For example, if you examined the images shot by the best and worst Canon 100mm f/2.8 IS L lenses, you could probably see a bit of difference if you looked at the images side-by-side (the images I took on my test camera). But if I handed you the two lenses and you put them on your camera, they’d behave slightly differently and the results would be different.

So for those of you who spend your time worried about getting “the sharpest possible lens”, unfortunately sharpness is rather a fuzzy concept.

Roger Cicala


October, 2011



Matt’s comment made me realize I hadn’t talked about one obvious variable in this little post: how much of the variation is caused by the fact that these are rental lenses that have been used? The answer (at least for Canon prime lenses) is not much, if at all. For example the graph below compares a set of brand new Canon 35mm f/1.4 lenses tested the day we received them (red boxes) to a set taken off of the rental shelves (blue diamonds).

Comparison of stock 35mm f/1.4 lenses with new-from-box copies

Please note I make this statement only for Canon prime lenses. Zooms are more complex and I see at least one zoom lens that doesn’t seem to be aging well, but until I get more complete numbers to confirm what I think I’m seeing I won’t say more. I see no reason to expect other brands to be different, but at this point we’ve only been able to test Canon lenses (these tests are pretty time consuming and we have a lot of lenses).


Author: Roger Cicala

I’m Roger and I am the founder of Lensrentals.com. Hailed as one of the optic nerds here, I enjoy shooting collimated light through 30X microscope objectives in my spare time. When I do take real pictures I like using something different: medium format, a Pentax K1, or a Sony RX1R.

Posted in Lenses and Optics
  • Roger Cicala

    We haven’t seen any difference between recent lenses and older lenses, but we don’t have any way to tell which lenses were made post-tsunami.

  • I am impressed Roger!! This explains why I have always rented my lenses from you. Always reliable and always professional and friendly. Keep up the great work!!!

  • Roger Cicala

    Sandra – Canon lenses were used for the example, but it is pretty similar with all brands.

  • R

    Merrick, Live View was manual focus – so he could magnify to get sharper manual focus than with an optical viewfinder. The article does not test the difference between phase detect autofocus and contrast detect autofocus. I think phase detect would be more accurate, but it could vary from model to model.

  • Sandra

    Hey Roger Help Me…Problem in canon Lenses?…Not in Nikon?…should I switch to Nikon?

  • Lynn Allan

    Good article.

    I’d be interested in charts/graphs where you took an average performing combination of lens+camera, and used micro-focus-adjustment to optimize. My impression is that the combination should improve to a greater or lesser degree, and that there would be more uniformity.

    Or not?

  • Stas

    Thanks for the excellent article! I just miss one thing – a test run on different samples, but with manual focusing. I noted that the pictures taken with manual focus form a tighter group and stay well above the ones taken with autofocus on the same lens. If you do 4-5 shots with MF on each lens and choose the best sample, the resulting graph should be closer to a pure lens-difference inspection. Continuing your point on having too many variables, we should list not only the lens and AF, but rather break them down into the body-to-lens protocol, the autofocus optics, and the in-camera autofocus algorithm, meaning that even body-to-body with a perfect lens would produce a cloud of results.

  • Basco

    very good review, thk you for sharing it with the rest of us, keep up the good work.

  • Roger, thank you for the very informative article.
    Did you know that Canon has very sophisticated calibration software, which allows each and every autofocus sensor to be calibrated individually, and for zoom lenses even allows different MFAs to be specified depending on zoom setting?
    Also, different autofocus sensors in one camera can be misaligned (this results in different focus results when the camera focuses with different focus points), and that Canon software allows them to be aligned evenly. MFAs can be recorded for a specific camera/lens combo based on the lens’s internal ID, not the lens name, which is how the standard in-camera MFA works.
    I saw how service engineers calibrated a Canon 40D + Canon 50L combo and I wished I had that software; it is really not hard to use and very powerful. Unfortunately, it is not public – it is provided only to authorized Canon service centers. If Canon released this software for public use (it even has good enough help documents), all those problems would be gone, I think. With that software, almost any photographer could test autofocus and find, for instance, that the center horizontal autofocus sensor should be adjusted and the other sensors should not be, which was the case with that 40D + 50L combo. The standard in-camera MFA does not allow calibrating one autofocus sensor only; it adjusts all focus points at once. If this is new info for you and you have any questions about how Canon service calibrates lenses, just send me a message.


  • Andre

    Seriously, do you expect to get any accurate focus results using those bodies?

    Why not try the same methods with a brand of camera that is reliable in the focus dept.

  • Voe

    This is why Leica lenses are so expensive. Because there is no sample variation. Everything is built and tested to the highest standards.

  • WOW … yet another great article Roger – super job again. BTW, you say

    “The variation among “good copies” could probably be detected by some pixel peeping. For example if you examined the images shot by the best and worst Canon 100 f2.8 IS L lenses you could probably see a bit of difference if you looked at the images side-by-side (the images I took on my test camera).”

    You might consider putting those two images side-by-side … i.e. show us graphically how (little?) difference there is. Plus toss up the bad copy. Obviously full-res for all of ’em.

    The suggestion to do error bars is a good one … although adds another dimension so lots more testing. The 3rd figure down (suggestion: number your figures/graphs) titled “Copy 7, repeatedly autofocused (blue diamonds) and manually focused (green triangles)” does exactly this. Encouraging that the spread is pretty tight to my untrained eyes.

    It’s late at night, but I’m assuming that “autofocused” meant using phase AF and “manually focused” actually means use contrast AF (rather than Mark I eyeball and manually rotating the focus ring) – if I’m wrong, ignore the next paragraph.

    The data shows that contrast-AF squeezes the best performance out of the lens, which I don’t find surprising. Although I can’t help but wonder if a little microfocus adjust might have helped here … i.e. how much would a ±1 have moved those phase-AF data points?

  • mantra

    nice article
    may i ask a question ?
    did the earthquake & tsunami affect the build and quality control of Canon lenses?


  • It is interesting: when Sony bought Minolta’s camera division, one of the Sony directors said that he was really surprised by how manual the whole manufacturing process is and at how many levels things can go wrong with the current AF system. Back then he claimed that Sony would tackle that. Now with the SLT and NEX series we do see the landscape changing after a long time, but the question is: is it really better? Can we count on more consistent results across the camera/lens system? Any experience with an SLT camera/lens combo?

  • James

    Excellent, excellent article. Your description of those who wanted the top 3 copies of a particular lens reminds me of how at Steinway’s headquarters in NYC, there’s a special room where the world’s great pianists get to hand-pick the top pianos (out of several $100,000+ pianos!) to get the one that sounds just ‘right’.

    When I first read about Fujifilm’s choice of a non-interchangeable fixed lens in their X100, they justified it by stating that the camera’s sensor was customized just for that lens in order to allow light to strike the sensor at an angle that is as perpendicular as possible (more important for digital sensors than for film). Were Fujifilm’s engineers possibly trying to avoid the performance variations that might exist with interchangeable lenses, as described in your article?

    I’d love to learn more about this topic, especially one of the above comments’ thoughts on how contrast detection autofocus systems might minimize the issues described in your piece. Again, thank you for an excellent article.

  • Bob

    If by “not a scientific study” you mean the results are meaningless, I agree. But you seem well intentioned, and I hope you understand that I’m not nitpicking here.

    You could create basic error bars by measuring the same copy of the same lens, same setup, multiple times and then calculating the standard deviation of your measurements.

  • David

    Love the articles. I don’t suppose you’ll take a stab at variation among zoom lenses? Could we assume that a given lens might be sharper at some, but not all, focal lengths, or might we expect a lesser copy to be uniformly worse?

    Another of the more interesting attributes you can discover with Imatest is T-stop transmission. The variability here among ostensibly similar lenses would make for a nice graph.

  • Roger Cicala

    Allan, that’s a great idea. I’ll give that a try.

  • Roger Cicala


    I still miss my 5D Classic. I loved that camera till I ran the shutter out. But I agree completely with your assessment: new algorithms probably change the way a third party lens behaves, but the manufacturer probably makes a point to include all of their own lenses in the new setup.

  • Roger Cicala


    I think there’s a lot of truth in that. One thing that impressed me was that the really bad lenses shown in the graphs could be detected if we looked at the images on a monitor, but even then you had to look rather carefully. For the general group of lenses, I think it would take some serious pixel-peeping or lab-type measurement to see the difference.


  • Roger Cicala


    Our experience is that usually if we just send a lens in saying it’s soft, all we get is an electrical adjustment and “lens in spec” reply. If we describe the problem carefully, like “lens is softer on the left side when focusing at 8-20 feet, less so at infinity” we tend to get a good repair.

  • Roger Cicala

    I don’t have the figures at hand, but if you look at the graph in the addendum, the new lenses were tested on different days (with re-setup) than the older lenses. It took a month of developing our techniques to get reproducible results, but once we had the proper support and alignment equipment in place the results are very reproducible session to session.


  • Roger Cicala

    We don’t right now, we just don’t have enough staff during busy season, but we may offer it as a service during the winter.

  • Roger Cicala

    It was outside the main point of this article, but when we run a batch of lenses through on autofocus on a given camera there are always a few that are consistently badly back- or front-focused, and those would definitely be corrected by microadjustment.

  • Roger Cicala

    This is just a quick demonstration, not a scientific study. However, I’m not aware of a method of placing error bars on actual data points.

  • Bob

    Without error bars, it is impossible to tell if any of the results are statistically significant.

  • cj

    It would be interesting to see how the EXIF focus distance compares to the real distance and sharpness in this study.

  • Roger Cicala


    I can’t say for certain, but my gut feeling is there would still be variation, although much less so.

  • Roger Cicala

    Darrill, I believe that to be true – at least it is in my hands. I know temperature and humidity can have a very real effect, but I think there are other variables, too.
