LensRentals.com

Notes on Lens and Camera Variation

A funny thing happened when I opened Lensrentals and started getting 6 or 10 copies of each lens: I found out they weren't all the same. Not quite. And each of those copies behaved a bit differently on different cameras. I wrote a couple of articles about this: This Lens is Soft and Other Myths talked about the fact that autofocus microadjustment would eliminate a lot, but not all, of the camera-to-camera variation for a given lens. This Lens is Soft and Other Facts talked about the inevitable variation in mass producing any product, including cameras and lenses: there must be some real difference between any two copies of the same lens or camera.

A lot of experienced photographers and reviewers noted the same things and while we all talked about it, it was difficult to use words and descriptions to demonstrate the issue.

And Then Came Imatest

We've always had a staff of excellent technicians that optically test every camera and lens between every rental. But optical testing has limitations: it's done by humans and involves judgement calls. So after we moved and had sufficient room, I spent a couple of months investigating, buying, and setting up a computerized system to allow us to test more accurately. We decided the Imatest package best met our needs and I've spent most of the last two months setting up and calibrating our system (Thank you to the folks at Imatest and SLRGear.com for their invaluable help).

It has already proven successful for us, as it is more sensitive and reproducible than human inspection. We now find some lenses that aren't quite right, but that were perhaps close enough to slip past optical inspection. Plus the computer doesn't get headaches and eyestrain from looking at images for 8 to 10 hours a day.

Computerized testing has also given me an opportunity to demonstrate the amount of variation between different copies of lenses and cameras. We have dozens (in some cases dozens of dozens) of copies of each lens and camera. While we don't perform the multiple, critically exact measurements that a lens reviewer does on a single copy, performing the basic tests we do on multiple copies demonstrates variation pretty well.

Lens-to-Lens Variation

We know from experience that if we mount multiple copies of a given lens on one camera, each one is a bit different. One lens may front focus a bit, another back focus. One may seem a bit sharper close up, another is a bit sharper at infinity. But most are perfectly acceptable (meaning the variation between different copies is a lot smaller than the variation you're likely to detect in a print). I can tell you that, but showing you is more effective.

Here's a good illustration: a run of multiple copies of 3 different 100mm lenses, all of which are known to be quite sharp (the original Canon 100mm f/2.8 Macro, the newer Canon 100mm f/2.8 IS L Macro, and the Zeiss ZE 100mm Makro). The chart shows the highest resolution (at the center of the lens) on the horizontal axis, and the weighted average resolution of the entire lens on the vertical axis, measured in line pairs / image height. All were taken on the same camera body, and the best of several measurements for each lens copy is the one graphed.

Resolution of multiple copies of several 100mm lenses

It's pretty obvious from the image there is variation among the different copies of each lens type. I chose this focal length because there was a bad lens in this group, so you can see how different a bad lens looks compared to the normal variation of good lenses. As an aside, the bad lens didn't look nearly as bad as you would think: if I posted a small JPG taken with it, you couldn't tell the difference between it and the others. Blown up to 50% in Photoshop, though, the difference was readily apparent.

My point, though, is that while the Canon 100mm f/2.8 IS L lens is a bit sharper than the other two on average, not every copy is. If someone were doing a careful comparative review, there's a fair chance they could get a copy that wasn't any sharper than the other two lenses. I think this explains why two careful reviewers may have slightly different opinions on a given lens. (Not, as I see all too often claimed on various forums, because one of them is being paid by one company or another. Every reviewer I know is meticulously honest.)

Autofocus Variation

We all know camera autofocus isn't quite as exact as we wish. (Personally, after investigating how autofocus works for this article, I'm amazed that it's as good as it is, but I still complain about it as much as you do.) But when I started setting up our testing, I was hoping we could use autofocus to at least screen lenses initially. The results were rather interesting. Below is the same type of graph for a set of Canon 85mm f/1.8 lenses I tested using autofocus. Notice I again included a bad copy as a control.

 

Test run of a dozen Canon 85mm f/1.8 lenses (and one known soft copy) using autofocus

(For those of you who are out there thinking "I want one of those top 3 copies, not one of the other ones", and I know some of you are, keep reading.)

Then I selected one copy that had average results (Copy 7), mounted it to the test camera, and took 12 consecutive autofocus shots with it. Between each shot I'd either manually turn the focus ring to one extreme or the other, or turn the camera off and on, but nothing else was moved. (By the way, for testing the camera is rigidly mounted to a tripod head, mirror lock up used, etc.)

In the graph below, overlaid on the original graph, the dark blue diamond shapes are the 12 autofocus results from one lens on one camera. Then I took 6 more shots, using live view 10x manual focus instead of autofocus, again spinning the focus dial between each shot. The MF shots are the green triangles. I should also mention that when I take multiple shots without refocusing the results are nearly identical - that would be a dozen blue diamonds all touching each other. What you're seeing is not a variation in the testing setup, it's variation in the focus.

Copy 7, repeatedly autofocused (blue diamonds) and manually focused (green triangles)

It's pretty obvious that the spread of sharpness of one lens focused many times is pretty similar to the spread of sharpness of all the different copies tested once each. It's also obvious that live view manual focus was more accurate and reproducible than autofocus. Of course, that's with 10X live view, a still target, and a nice star chart to focus on and all the time in the world to focus correctly. No surprise there, we've always known live view focusing was more accurate than autofocus.
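For readers who want to make this kind of repeatability comparison on their own numbers, it boils down to comparing the spread of repeated measurements. A minimal sketch, using made-up illustrative values (not the article's actual test data):

```python
from statistics import mean, stdev

# Hypothetical resolution numbers (line pairs / image height).
# These values are invented for illustration only.
copies_once_each = [725, 740, 690, 760, 710, 735, 700, 745, 720, 715, 750, 705]
one_copy_af = [700, 745, 680, 735, 715, 690, 755, 705, 725, 740, 695, 730]
one_copy_mf = [748, 752, 745, 750, 747, 751]  # live view 10x manual focus

def spread(samples):
    """Return (mean, standard deviation) for a set of trials."""
    return mean(samples), stdev(samples)

for label, data in [("12 copies, one AF shot each", copies_once_each),
                    ("1 copy, 12 AF shots", one_copy_af),
                    ("1 copy, 6 live-view MF shots", one_copy_mf)]:
    m, s = spread(data)
    print(f"{label}: mean={m:.0f}, stdev={s:.1f}")
```

With numbers like these, the standard deviation of one copy autofocused twelve times comes out similar to that of twelve copies shot once each, while the manual-focus trials cluster much more tightly, which is the pattern the graphs above show.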

One aside on the autofocus topic: because it would be much quicker for testing, I tried the manual versus autofocus comparison on a number of lenses. I won't bore you with 10 more charts, but what I found was that older lens designs (like the 85 f/1.8 above) and third party lenses had more autofocus variation. Newer lens designs, like the 100mm IS L, had less autofocus variation (on 5DII bodies, at least - this might not apply to other bodies).

Oh, and back to the people who wanted one of the top 3 copies: when I tested two of those repeatedly, I never again got numbers quite as good as those first numbers shown on the graph. The repeated images (including manual focus) were more towards the center of the range, although they did stay in the top half of the range, at least on this camera, which provides me an exceptionally skillful segue into the next section. (My old English professor would be proud. Not of my writing skills, but simply that I used segue in a sentence.)

Camera to Camera Variation

Well, we've looked at different lenses on one camera body, but what happens if we use one lens and change camera bodies? I had a great chance to test that when we got a shipment of a dozen new Canon 5D Mark II cameras in. First, I tested a batch of Canon 70-200 f2.8 IS II lenses on one camera, using 3 trials of live view focusing on each. The best results for each lens are shown as green triangles.

Then I took one of those lenses (mounted to the testing bench by its tripod ring) and repeated the series on 11 of the new camera bodies. The blue diamonds and red boxes this time each represent a different camera on the same lens. (4 test shots were taken with each camera, and while the best is used, each camera's four shots were almost identical.) Obviously the same lens on a different body behaves a little differently.

A group of Canon 70-200 f2.8 IS II lenses tested on one body (green triangles) and one of those lenses tested on 11 brand new Canon 5DII bodies (red squares and blue diamonds).

I separated the cameras into two sets because we received cameras from two different serial number series on this day. I don't know that conclusions are warranted from this small number, but I found the difference intriguing. And maybe worth some further investigation.

Summary

Notice I don't say conclusion, because this little post isn't intended to conclude anything. It simply serves as an illustration showing visually what we all (or at least most of us) already know:

  • Put different copies of the same lens on a single camera and each will vary a bit in resolution.
  • Put different copies of the same camera on a single lens and each will vary a bit in resolution.
  • Truly bad lenses aren't a little softer, they are way softer.
  • Autofocus isn't as accurate as live view focus, at least when the camera has not been autofocus microadjusted to the lens.

All of this needs to be put in perspective, however. If you go back to the first two charts, you'll notice the bad copies are far different from the shotgun pattern shown by all the good copies. And when we looked at those two bad copies, we had to look fairly carefully (at 50% jpgs on the monitor) to see they were bad.

The variation among "good copies" could probably be detected by some pixel peeping. For example if you examined the images shot by the best and worst Canon 100 f2.8 IS L lenses you could probably see a bit of difference if you looked at the images side-by-side (the images I took on my test camera). But if I handed you the two lenses and you put them on your camera, they'd behave slightly differently and the results would be different.

So for those of you who spend your time worried about getting "the sharpest possible lens", unfortunately sharpness is rather a fuzzy concept.

Roger Cicala

Lensrentals.com

October, 2011

 

Addendum:

Matt's comment made me realize I hadn't talked about one obvious variable in this little post: how much of the variation is caused by the fact that these are rental lenses that have been used? The answer (at least for Canon prime lenses) is not much, if any. For example, the graph below compares a set of brand new Canon 35mm f/1.4 lenses tested the day we received them (red boxes) to a set taken off of the rental shelves (blue diamonds).

Comparison of stock 35mm f/1.4 lenses with new-from-box copies

Please note I make this statement only for Canon prime lenses. Zooms are more complex and I see at least one zoom lens that doesn't seem to be aging well, but until I get more complete numbers to confirm what I think I'm seeing I won't say more. I see no reason to expect other brands to be different, but at this point we've only been able to test Canon lenses (these tests are pretty time consuming and we have a lot of lenses).

 

106 Responses to “Notes on Lens and Camera Variation”

Basco said:

very good review, thk you for sharing it with the rest of us, keep up the good work.

Stas said:

Thanks for the excellent article! I just miss one thing - a test run on different samples, but with manual focusing. I noted that the pictures taken with manual focus form a tighter group and stay well above the ones that are taken with autofocus on the same lens. If you do 4-5 shots with MF on each lens, choose the best sample and make a graph, that should be closer to lens difference inspection. In continuing your point on having too many variables, we should list not only lens and AF, but rather break them down into body-to-lens protocol, autofocus optics, and in-camera autofocus algorithm, meaning that even body-to-body with a perfect lens would produce a cloud result.

Lynn Allan said:

Good article.

I'd be interested in charts/graphs where you took an average performing combination of lens+camera, and used micro-focus-adjustment to optimize. My impression is that the combination should improve to a greater or lesser degree, and that there would be more uniformity.

Or not?

Sandra said:

Hey Roger Help Me...Problem in canon Lenses?...Not in Nikon?...should I switch to Nikon?

R said:

Merrick, Live View was manual focus - so he could magnify to get sharper manual focus than with an optical viewfinder. The article does not test the difference between phase detect autofocus and contrast detect autofocus. I think phase detect would be more accurate, but it could vary from model to model.

LensRentals Employee

Roger Cicala said:

Sandra - Canon lenses were used for the example, but it is pretty similar with all brands.
Roger

AJ Borromeo said:

I am impressed Roger!! This explains why I have always rented my lenses from you. Always reliable and always professional and friendly. Keep up the great work!!!

LensRentals Employee

Roger Cicala said:

We haven't seen any difference between recent lenses and older ones, but we don't have any way to tell which lenses were made post-tsunami.
Roger

LensRentals Employee

Roger Cicala said:

Oleg,
I know of the factory calibration software, but this is a very pertinent point. It can do a much more thorough job than we can with microadjustment. My understanding is it requires specific test targets and the setup is too expensive for smaller repair shops to afford, but I totally agree: it would be an awesome tool to have available.

Bart said:

Hi Roger,

I really love your articles, and especially the position you take; no conclusions, no big statements, just proper testing and modest judgements. Anyhow, I have a question; Can you think of a reason why manufacturers won't let the CDAF (Contrast Detect AutoFocus) be used to calibrate the PDAF (Phase Detect AutoFocus)? I could imagine mounting a new lens, focusing with CDAF on a test-chart at different distances, with which the PDAF is calibrated. This would combine the pro of CDAF, of focusing on the image-plane, with the speed of the PDAF.

Kind regards and thanks for your great articles!
Bart

Oleg said:

Roger,

Regarding Canon calibration software:
Yes, service uses a special setup, but feature-wise it is basically vertical and horizontal focus targets with a ruler to measure focus error. First, they focus on the targets, then replace them with the ruler and take a shot. After that the software analyzes the contrast to determine the area of best focus, and then adjustments are made. So, this special equipment can be replaced by a focus target (vertical or horizontal lines printed on paper) and a ruler. It will work slower than the special equipment, but it will work. The only problem is software. Canon is very serious about it. Installation is tied to a machine, it will not work on another machine, and Canon manages keys for each installation. That is why that software is not well-known. It can run only on authorized machines.

Guthrie said:

Another fantastic article! Thanks for keeping the information flowing, and public! You do a great service to the community.

Will this be making its way over to canonrumors?

I'm curious as to which zoom you need "more complete numbers" to talk about. Should be a good read, I'm sure!

Keep up the great work,

Cheers,

LensRentals Employee

Roger Cicala said:

Bart,
I can't think of any good reason they don't do that. It seems to me it would be a simple software / firmware add on. And would be amazingly useful. I'd also love to be able to flip a switch and get to choose phase detection AF when I needed speed and contrast detection for accuracy.
Roger

alek said:

Ditto what Bart and Roger said about using contrast-AF to auto-microadjust phase-AF ... surprising/disappointing that the manufacturers haven't provided this capability.

P.S. Any chance Roger of showing us some full-res images that better show the real-world impact of focusing/lens variability?

Udi said:

Roger,

Your tests are very interesting. You have tested a set of identical lenses against one body, and then a single lens against multiple identical bodies.

My question is - can you actually use this to select a "good" body and a "good" lens, or is this a question of the combination of a specific body and lens?
Asked otherwise - if you run the same group of lenses against a different camera body than the one you just did, will the top performing lens remain the same one, or will a lens that wasn't so good in the previous test become much better because the specific body and lens cancel out each other's errors?

Les Burns said:

Great article.
I remember years ago, at a large camera store in Chicago, they offered special selections of new Goerz gold dot lenses (supposedly the best) for a premium; I always wondered if it was worth the money; at least I suppose it kept you from getting a dog.
When I ordered M300 Nikkors for my Gowlandflex (a 4x5 TLR), Olden got them to match the focal lengths tightly. Judging from what you said about Canon's focus checking, it doesn't seem as if focal length should be a factor in lens quality variation.
I also remember hearing Leitz had a series of cams (or maybe an adjustment) to match variation in focal length on their rangefinder lenses.

Chris said:

One of the most useful, informative articles I have ever read! Thank you sooo much for this. :)

John Dunne said:

A great read, thanks for being a beacon of sanity :-)

eric said:

Geesh, more people stealing my idea to use contrast detection to calibrate phase detection! Just kidding, but I posted that idea many months ago. Canon may have simply not thought of doing so... they could even have a firmware routine that almost totally automates the process (the user would be prompted to change focal lengths).

John Hartigan said:

I love this article! Best I have read on any photography tech subject in years. Before CAD and CAM, when cameras and lenses were designed and manufactured by craftsmen and not computers, we said the difference between a 60 dollar lens and a 3000 dollar lens was how many ended up on the trash pile instead of on store shelves. A great lens was rigorously checked before release, which is probably still done in the Cine world.
Thank you Roger

George said:

Thanks for this article... a great read. Maybe I missed this in the article, but were calibrations performed on the cameras before the tests? I guess my root question is around the thought of... are these variations after calibration, or out-of-the-box variation?

tlinn said:

Hey Roger,

Just another thank you for a great article. You are really in a unique position to provide insights like this and I certainly appreciate your doing so.

Tim

LensRentals Employee

Roger Cicala said:

George,

Those data points are random: lenses are not microfocus adjusted to the test camera. Our goal is to get sort of a worst case scenario since we're trying to see how they'll work for our customers.

Roger

LensRentals Employee

Roger Cicala said:

Udi,

It's easy to find a good camera-lens match, but the lens that's sharpest on camera A is not likely to be the sharpest on camera B. If I run the same batch of lenses on another camera (which we've done) the overall data points will be similar, but the lenses will shift around within the grouping.

Roger

J. Skinner said:

Great article. Written better than most professional engineers could do.
I really liked your filter tests too.

I understand your graphs, but it takes some mental extrapolation to realize how tightly grouped the data points are. If one of your graphs went to zero-zero on the 2 axes, it would appear to the casual reader that there is much less variation between the best and worst lenses, and they would be less likely to send one back to the seller.

Marc Beckwith said:

An outstanding article. I especially appreciate the lack of sweeping generalizations, allowing your research to speak for itself. You've provided a great service to the pixel peepers amongst us!

John Jovic said:

Nice job Roger, keep it up. You're in a special position to be able to do this kind of testing and of course you don't have to share your efforts with any one so it's much appreciated that you do.

JJ

LensRentals Employee

Roger Cicala said:

That's a very good point on the graphs: I was zooming the axis a bit to emphasize the difference, but probably should have started with a zero axis graph to show how tight the group really is.

Thanks,
Roger

Steve said:

Great work on a subject that has been a thorn in my side for many years. I tend to shoot wide open with very fast lenses and have been tortured by AF misses. I think it's a great argument for digital viewfinders, which, I believe, are the best way to achieve consistent correct focus.

Philip said:

Great article. Really good to see the variability of AF exposed like this. I had a 70-200 2.8L I sold earlier in the year as I was never happy with its focusing. It went back to Canon CPS 3 times but they could never find anything wrong with it. Bought a MkII and it's great. Go figure!

John Kennekam said:

Many cameras now have fine-tuning as an option, but the whole process is laborious. Why not simply automate the process for the customer? It could work as follows:

1. The camera and lens are mounted on a tripod.
2. A focus pattern sheet is downloaded from the vendor's web site and printed out.
3. The camera is pointed at the pattern sheet.
4. Now the magic happens. The camera automatically takes a series of images ranging from -20 to +20.
5. The camera analyses which shot is the sharpest (using existing AF software) and automatically saves the setting for that lens.

The above process is not new; it is just that the user must do steps 4 and 5 manually, which takes forever if you have a lot of lenses.
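The sweep-and-pick-the-sharpest loop proposed here is simple enough to sketch. Nothing below is a real camera API: `SimulatedCamera` and `sharpness` are hypothetical stand-ins, and the sketch only shows the argmax-over-settings logic the comment describes.

```python
def auto_microadjust(camera, sharpness_of):
    """Try every AF microadjustment setting from -20 to +20, autofocus and
    shoot at each one, and keep the setting whose frame scored sharpest."""
    best_setting, best_score = 0, float("-inf")
    for setting in range(-20, 21):
        camera.set_af_microadjust(setting)
        frame = camera.autofocus_and_shoot()
        score = sharpness_of(frame)
        if score > best_score:
            best_setting, best_score = setting, score
    camera.set_af_microadjust(best_setting)  # save the winner for this lens
    return best_setting

class SimulatedCamera:
    """Hypothetical stand-in for a body's adjustment API."""
    def __init__(self):
        self.setting = 0
    def set_af_microadjust(self, setting):
        self.setting = setting
    def autofocus_and_shoot(self):
        return self.setting  # the "frame" here is just the setting applied

def sharpness(frame, true_offset=5):
    # Toy metric: sharpest when the adjustment matches the body's true offset.
    return -abs(frame - true_offset)

print(auto_microadjust(SimulatedCamera(), sharpness))  # prints 5
```

In reality the AF scatter shown in the article means each setting would need several shots averaged before comparing scores, which is exactly why doing steps 4 and 5 by hand takes so long.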

jamesm007 said:

Oleg

That's how the Pentax service center tunes the focus on bodies as well. They don't use charts. They put the camera on a special machine and turn screws on the bottom until the PC monitor, which is loaded with a special program, tells the tech the AF is good.

Just a few months ago, when my K20D had almost 50,000 clicks on it, the rear e-dial started to act up and I sent it to Pentax Service in the USA. The tech checks all specs on his bench before the camera comes back to me. He found the AF out of whack and adjusted it.

Now you must understand, I thought the AF on the body was perfect. I would nail BIF like they were standing still. No problems with AF fine tuning my lenses, although some, like the kit lens, needed +8 to be decent. So where I stood, my K20D's AF was perfect.

When I got it back and took pics with all my lenses, the first thing I noticed is the kit lens is better. So I started my very tedious procedure of setting AF fine tuning. It's hard because, as the article stated, there is AF variance even under perfect conditions. It took almost a week for me to set the AF fine tuning on my DA55-300mm (I do work).

With AF tuning done, the kit lens now needed 0 AF fine tuning; before it needed +8. The DA55-300mm needed +5; before it needed 0. Those stand out to me. The others were about the same. The IQ of all the lenses stayed the same from what I can see, except the kit (DA18-55mm WR), which got better.

My conclusion is that first-time buyers are in for a long road of learning, that these articles should be a must read for the newbie to take to heart, and that the maintenance schedule in the owner's manual should at least be looked at. Many don't even know dSLRs are supposed to have maintenance.

I hope the author here can one day speak, from his unique standpoint, on maintaining dSLRs and whether AF falls out of whack over time.

Ros said:

I get much more consistent AF and hits with my GH1's CDAF than I did with my 40D. I don't shoot action, so for me I'm never going back to PDAF, especially after all the trips to the lab for AF calibration (lenses and bodies).

Thanks for the article, keep 'em going!

DaveD said:

What is the rate of 'bad copy' lenses?

I'm not overly concerned about the minor variations between the good copy lenses. But as a lone consumer, how do we decide if we have purchased one of those 'bad copies?' A little worrisome that you have 2 of those in this relatively small sample.

In your experience, just how rare are the bad copies? Would a casual comparison with one's other lenses detect one of these bad apples?

LensRentals Employee

Roger Cicala said:

Dave,

Remember, with these samples I purposely included some that had failed our standard inspection process (optical test-charts, etc.) so we would get some perspective on the much bigger difference between a bad lens and just sample variation among normal lenses. Bad copies, out of the box, are rare, but they happen. Maybe 1%, but some variation depending on brand, if it's a newly released type of lens, is it a zoom, does it have IS, etc. We're going to have more bad copies than that because our lenses are used heavily: a good lens that gets dropped becomes a bad lens most of the time.

Roger

Jack C said:

Nice article... the charts summarize the information very well.

I would be very interested to see a similarly in-depth analysis of how Contrast Detect AF compares with Phase Detect AF in terms of accuracy and consistency of results.

-JC

Robert S said:

Roger,

Thanks for a helpful article. How often should one send their lenses in for service? Recently, I noticed softness in some images and thought I needed to buy primes. I sent my 24-70mm zoom and 5D MkII back to Canon and they came back noticeably improved. Canon's note that they repaired the mechanical linkage reminded me, as several writers above did, to follow the camera's maintenance schedule.

LensRentals Employee

Roger Cicala said:

Robert,

I don't have a blanket answer, but you are providing me a nice intro to an article I'm working on now: looking at how various lenses perform in relation to their age. It's interesting because the one lens we see deteriorating over time is the 24-70 f2.8: the zoom linkage / seals tend to wear out. I assume (but don't know for certain) it has to do with a heavy internal barrel that causes wear and tear with long-term use.

dan said:

I personally feel we have been duped by Canon L lenses. Almost every good nature photograph I have seen has been taken using a tripod. I question the need for IS on any lens over 300mm, and the only reason the 300mm 2.8 is so sought after is because Bob Atkins published his unscientific study for years. Canon should be paying Bob Atkins royalties.

I don't blame Bob, but his gullible readers are another thing. Maybe it's human nature. The 300mm 2.8 seemed like the perfect lens. It used to be reasonably priced - not anymore.

Consumers won't buy the most expensive lens, or the least expensive - it's usually the one in the middle. The 300mm 2.8 was that lens - until the internet.

Oh yeah, people, B and H is not the only store in the world.

When I saw the price of the 300mm 2.8 zoom from $4,200 to close to 8 grand, I started to boycott the company. It's pure greed.

dan said:

If you're new to nature photography, remember this: there's a reason they give you a nice box with big Canon lenses. That's where they stay most of the time. Buy a long prime and a tripod and get outside and take pictures.

Too many photographers scour the internet looking for a technical edge, and really don't take many pictures. I used to be one of those boobs.

Thomas Andre said:

This isn't the same issue, but is related. I have a Tamron 18-270 PZD and a Canon 55-250 lens (for smaller digital sensors). At 18mm the Tamron matches, in size of image, my 18-55mm Canon kit lens (for a 20D). At full extension, however, the Canon at 250mm produces a more magnified image than the Tamron at 270mm. Have you ever checked the magnification accuracy?

LensRentals Employee

Roger Cicala said:

Thomas, we do check that and all of the zooms (and a few primes) vary a bit in the actual, versus stated, focal length. For example the Canon 24-70 zoom is actually 25mm to 67mm, the 70-200 is 74mm to 195mm, and the Sigma 50-500 actually is a 55mm to 470mm. I think the rule of thumb is they should be within plus or minus 5% but I'm not certain that's always true.
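The figures in the reply above make the rule of thumb easy to check numerically. This sketch uses the stated-versus-measured ranges quoted there, with the ±5% tolerance treated as an assumption since the reply itself isn't certain of it:

```python
# Stated vs. measured focal-length ranges (mm), from the reply above.
lenses = {
    "Canon 24-70":  ((24, 70), (25, 67)),
    "Canon 70-200": ((70, 200), (74, 195)),
    "Sigma 50-500": ((50, 500), (55, 470)),
}

def deviations(stated, actual):
    """Percent deviation of actual from stated focal length at each end."""
    return tuple(100.0 * (a - s) / s for s, a in zip(stated, actual))

for name, (stated, actual) in lenses.items():
    wide, tele = deviations(stated, actual)
    flag = "" if max(abs(wide), abs(tele)) <= 5 else "  <- outside +/-5%"
    print(f"{name}: wide {wide:+.1f}%, tele {tele:+.1f}%{flag}")
```

Run this and the 24-70 lands within ±5% at both ends, while the 70-200's wide end (+5.7%) and both ends of the Sigma (+10%, -6%) fall outside it, which fits the reply's caveat that the rule may not always hold.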

John Angulo said:

For these plots, how do you compute the average resolution of the entire lens? You call it a "weighted average" - how is it weighted? In a simple unweighted average of multiple measurements across the image, a fuzzy corner due to decentering might not have much influence on the result, even if some of the measurements capture it. But a fuzzy corner or side is just what makes the difference between good and bad copies for critical purposes. Would it be possible to plot the lowest measured resolution on the vertical axis, wherever on the image that occurred? It might be interesting to see the scatter among different copies in this case.

LensRentals Employee

Roger Cicala said:

Hi John,

The weighted average uses the center point x 1, 6 mid points (which include the top and bottom center edges) x 0.75, and corners and lateral edges x 0.5. One bad corner does drag the weighted average down significantly, but more importantly we hardly ever see just one bad corner: usually it's a bad side (or top/bottom), and occasionally both contralateral corners are bad (certain kinds of tilt can cause it), and in such cases the weighted average is awful. As far as our testing goes, we flunk far more lenses on the weighted average than on center sharpness. Not surprising, really, when you think about how much more a tilt or decentering will affect the outer area of the lens.

Roger
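The weighting scheme Roger describes is concrete enough to put into a few lines of code. This is a sketch of the stated weights only; the actual measurement points come from Imatest, and the resolution numbers below are invented for illustration:

```python
def weighted_resolution(center, mids, outers):
    """Weighted average resolution as described above: center point x 1.0,
    the six mid-field points x 0.75, corner/lateral-edge points x 0.5."""
    points = [(center, 1.0)]
    points += [(m, 0.75) for m in mids]
    points += [(o, 0.5) for o in outers]
    return sum(v * w for v, w in points) / sum(w for _, w in points)

# Invented numbers: a well-centered copy vs. one with a soft side.
good = weighted_resolution(800, [700] * 6, [600] * 6)
soft_side = weighted_resolution(800, [700] * 4 + [450] * 2, [600] * 4 + [350] * 2)
print(round(good), round(soft_side))  # the soft side drags the average down
```

With these sample values the decentered copy's weighted average drops by roughly 70 line pairs even though its center number is identical, which is why the weighted average flunks far more lenses than center sharpness does.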

Frans van den Bergh said:

If you want to evaluate the sharpness of your own lenses, or you wish to calibrate the AF fine tuning of your DSLR body, you should try MTF Mapper

http://sourceforge.net/projects/mtfmapper/

This utility is totally free, and source code is provided. Or you could buy Imatest for $300-$5000 :)

Please read the user documentation thoroughly, though.

I would appreciate any feedback, and welcome all discussion!

Hendrik said:

I second Jack C on
"interested to see a similarly in-depth analysis of how Contrast Detect AF compares with Phase Detect AF in terms of accuracy and consistency of results".
Please.

Scott said:

Great article, Roger.

Trying to get a sense of how much variability is represented by the cloud in your first graph, I have a question:
Suppose I'm walking around my city in daylight, taking actual pictures. I mostly have the lens stopped down a bit, and I pay attention to using appropriate shutter speeds. Is there enough difference between the best and worst results in the cloud (not including the failed lens) to be consistently visible in (say) an 11 x 14 print?

David Tombs said:

Great article. I am sure you can answer my question on lenses.
I am looking for a prime lens for my Canon - about 85mm.
I note from the regular data that the same FL lenses range from £650 to over £1200, the latter being the L series.

Q. Why the difference in price, and does it really make a difference that can be seen in the finished photo, or is it just about a stop of speed?

Thanks

LensRentals Employee

Roger Cicala said:

David,

It depends what the finished photo is (the difference is readily apparent in a large print, hardly apparent in a web jpg). But it's mostly about the stop of light and the ability to blur the out-of-focus areas. Increasing speed is increasingly expensive - an f/1.2 lens is a very special, and expensive, thing. But most photographers don't need it.

