Here’s Roger’s Law of Online Photography forums: “those who seek anonymity need it.” My online name on every forum is either RCicala or RogerCicala. If I say something stupid on one, you can not only call me out on it there, you can send me an email here. Many people have done so over the years. On the other hand, when someone has an online name like PhotoGuruGod (name changed to protect the guilty), I generally figure there’s a reason they don’t want anyone to know their real name. Their posts often follow the “often wrong, never in doubt”, or “don’t let the facts interfere with what you know” schools of reasoning.
Anyway, I read a thread that started innocently enough: a member was asking why two different lens review sites had very different opinions about a lens he was considering buying. I actually know the reasons (there were two in this specific case), but before I could reply, PhotoGuruGod posted that “the manufacturers pay big money for those reviews, the reviewers just say what the manufacturers tell them to.” It’s not the stupidest thing I’ve read online, but it’s probably Top 10 material. I don’t do lens reviews for a living, but I’m peripherally involved with a number of people who do. While I would have enjoyed squashing PhotoGuruGod’s paranoid online rant, I thought it best to let it die a natural death and instead post some facts about lens reviews, their limitations and usefulness, and maybe a few guidelines about finding and using them.
One thing first: I’m a firm believer in full disclosure. LensRentals.com pays a fee for links from The-Digital-Picture.com and DigLloyd.com, and we loan test lenses to several other reviewers. In all cases, we found them, admired their work, and asked to be associated with them. The amount of money involved and referrals received is pretty small, accounting for less than 2% of our business. What we really benefit from is the ability to ask them questions. I learn a lot from all of them; they are very knowledgeable people. In any case, I’ve made certain this article doesn’t contain any specific information from any of them that isn’t openly available online.
About Online Review Sites
First, let’s get the BS out of the way. Lens reviewers don’t get paid by the manufacturers. There are photographers who are sponsored by manufacturers: Canon, Nikon, and Sigma all have photographers who they support with equipment or money in exchange for use of the photographer’s name and some photographs in advertising. These relationships are all openly acknowledged. None of those sponsored photographers are lens reviewers, and no lens reviewer I know of is getting money (or even equipment) from a manufacturer.
Professional lens reviewers make their living (at least the part that comes from the review site) by web traffic. The sites sell advertising and link to photography shops, getting a few cents when someone clicks on a banner or a commission on click-through purchases. None of them are supported by a manufacturer. None that I know of (and I know most of them) get free lenses from any manufacturer. Few of them even get loaner equipment from a manufacturer to test. In general, reviewers buy or borrow the lenses they test. Most reviewers believe the best way to get more web traffic is to provide worthwhile reviews that build their reputation. A very few try to generate lots of controversy, because controversy increases traffic, but those few are pretty clear and up front about what they’re doing, so that seems fair to me.
It’s pretty easy to see roughly how much traffic the various review sites generate using any one of the traffic monitoring sites available online.
It's also fairly easy to get a (rougher) idea of how much revenue a site generates using similar online tools. It's inappropriate to get specific here, but I think it's important to realize a lens review site isn't a get-rich-quick scheme. The very busiest ones could generate a 6-figure income, but those are the sites where reviews are part of a larger web presence. The larger pure review sites could generate a few thousand a month, but most generate much less than that. It's no coincidence, and probably for the best, that almost every reviewer is also a working photographer.
The fact that review sites aren’t remarkably profitable means that every review site I know of is a labor of love run by a gearhead like me (except they have better technique and far more patience). This makes the reviews thorough and careful, but it also ends up being the reason that technical lens review sites are all terminally flawed and that no one site can provide the “right” answer all the time. We’ll get to that later, though. First, let’s consider the various types of review sites and how they are best used.
Types of Reviews
First and foremost when reading an online review, it’s important to understand what type of review you are reading. I’ve listed some specific characteristics with an example of each, but some review sites overlap into two or more types:
- Technical lens reviews. This type of review site revolves around laboratory findings using lasers, targets, computer programs, etc. to demonstrate mathematically the sharpness, vignetting, distortion, chromatic aberration, etc. The equipment involved is expensive and these sites tend to be quite professional. slrgear.com is a good example.
- Hands-on reviews. These often have a significant technical component, measuring vignetting, distortion or sharpness, but are done by a regular photographer using equipment most of us could get if we were so inclined, not through laser-targeted computer analysis. They also have more comments about the handling and behavior of the lens during real world photography. Sometimes they find things through meticulous hands-on testing that the computers miss. diglloyd.com is a good example of a Hands-on review site.
- Personal reviews are like Hands-on reviews but usually these reviews are just one aspect of an established photographer’s website or blog. Personal reviews may be as technical as a Hands-on review (although they usually aren’t) but the emphasis is on a respected photographer’s opinion of the lens or camera after using it for a few days or weeks in the field or studio during the course of normal work. If a hands-on review is an article in the newspaper, the personal review is a respected editorial—more opinion with less emphasis on documenting the facts, but still very worthwhile. Michael Reichmann’s reviews at The Luminous Landscape are good examples.
- Public summaries. These are sites that allow multiple nonprofessional reviewers to post their opinions, usually in a structured format with some form of summary. There are always outliers, but reading several dozen quick real-world takes on a lens by different photographers can be very useful. The review section at Fred Miranda is a good example.
Most sites don’t completely fit into one category. In fact some of the best ones overlap categories quite a bit. Reviews at The-Digital-Picture.com and DPReview.com, for example, are both technical and hands-on. Photozone.de runs the gamut from technical to personal review.
Critically Reading a Review
It’s important when reading a review that you understand what you are reading. Too often people walk away from reading reviews saying “one guy said it’s great, another okay, and the third thought it wasn’t good at all.” If that’s what you took away from reading reviews, you left a lot of important information behind.
The first thing to consider is “what type of review was it?” In theory, technical reviewers should be largely in agreement with each other, since they’re testing the optics primarily. Hands-on reviewers should be similar too, although their photography background may provide more variation: a landscape photographer has a different take than a studio-based portrait photographer, for example. Personal reviews (and to some extent hands-on reviews) have to be read with the reviewer in mind. Michael Reichmann, for example, has forgotten more about photography than I’ll ever know, but I was flabbergasted one time when he glowingly reviewed a lens I hate. Then I spent some time on his site and realized he has very different priorities than I do: he’s constantly in the field and traveling, so small-and-light is a big plus for him, and much of his photography is about strong shapes and silhouettes, which the lens in question does very well with. Its slow autofocus isn’t an issue for him like it is for me.
The second (and perhaps less obvious) consideration is when was the review done, and on what camera. This can make a huge difference, particularly on a lens that has been around a while. Most reviewers date their reviews so the information is easy to get. Why is it important? A lot of reasons:
- Older reviews may be done on less demanding cameras. A lens that performed well on a 6 Megapixel camera may not be able to provide sufficient resolution on a 15 Megapixel camera. One that’s perfectly sharp on a crop sensor may have horribly soft corners on a full frame.
- Older reviews were done to less demanding standards. A very good reviewer I know started with a few test targets, a good tripod, and a level to measure horizontal. That was the standard at the time. Now he has a specially poured concrete leveling base, laser rangefinders to make sure targets are at perfect right angles, etc. That’s the standard today, but it wasn’t three years ago.
- Older reviews generally used camera autofocus. Camera autofocus adds another variable to the testing process: the review is supposed to test the lens, not the lens-and-camera’s-autofocus combination. Only in the last year or two have reviewers insisted on bracketing manual focus and selecting the best shot, or using Live View focusing, etc.
- Older reviews did not consider camera variability. It’s self-evident that cameras vary in tolerance just like lenses, but until a year or two ago everyone ignored this: the camera was considered perfect and the lens the variable. It’s become very obvious in the last year that a given lens behaves differently on different bodies because the bodies vary too. Manual focus bracketing and Live View focusing help eliminate some of this variation.
So when two reviews differ greatly in their opinion of a lens, it’s important to see if one is much newer than the other, or if they were done on different cameras. It’s also important to read the review site’s methods section. How do they focus? Where did they get the lens? Did they test more than one copy? It’s easy to get a ‘wow’ factor from lots of graphic measurements showing sharpness across the lens, distortion, blur index, etc., but pretty graphics don’t tell you if the lens was tested meticulously; they just demonstrate the numbers plugged into the program in visually appealing fashion. The real question is whether the numbers were accurate.
The bottom line is reading several reviews should give you a range in which it’s reasonable to expect a lens to perform. It might be as good as the best review you read, or as bad as the worst, but will probably fall somewhere in between. Knowing the reviewer’s point of view and methodology can be as important as the review itself.
The Problem with all Review Sites
I’m convinced that every reviewer is doing the best they possibly can to provide us with accurate information. There are some variations in the equipment and methodology they use, but all are worthwhile. There is one thing, though, that keeps even the best review sites from being truly scientific: sample size. When doing scientific research (a previous life of mine) you never study one subject: you study a number of subjects so that you can have an accurate idea of the range of findings. If I want to know how tall 3rd graders are, I don’t measure one 3rd grader and then declare “third graders are 5 feet, 5 inches tall.” I measure dozens (at least) of third graders and find “third graders average 4 feet, 8 inches tall.” I can then use statistical analysis and say “third graders are 4 feet, 8 inches tall on average and 90% of them are between 4 feet, 3 inches and 4 feet, 11 inches tall.” That’s a lot more useful information than “one third grader was 5 feet, 5 inches tall.”
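The sampling argument above is easy to sketch in code. Here’s a minimal Python illustration; the height measurements are invented for the example, and the 90% range is a simple trimmed sample range rather than a formal confidence interval:

```python
# Hypothetical heights (in inches) for a sample of 20 third graders.
# These numbers are invented for illustration, not real measurements.
heights = [55, 57, 54, 56, 58, 55, 53, 56, 57, 55,
           54, 56, 55, 57, 56, 54, 55, 58, 56, 55]

n = len(heights)
mean = sum(heights) / n

# A crude 90% range: sort, then drop the lowest and highest 5% of values.
ordered = sorted(heights)
trim = n // 20                 # 5% of the sample (1 value on each side here)
low = ordered[trim]
high = ordered[n - 1 - trim]

print(f"Average: {mean:.1f} in; 90% of the sample between {low} and {high} in")
```

One measurement tells you almost nothing; a sample gives you both a center and a spread, and the spread is the part single-copy lens tests can never show.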
If we were going to decide “which has the highest resolution, a Canon 24-70 f/2.8 or a Sigma 24-70 f/2.8,” there are first a lot of variables to define. On what camera? Okay, we’ll use a 5DMkII, since it has the highest resolution. Now: at what focal length? 24mm? 50mm? 70mm? Okay, we’ll test at all 3. What aperture: wide open? f/8? f/4? Okay, we’ll test at all 3 focal lengths and 3 apertures. This is going to take a while. And of course we’ll want to test for vignetting, lens flare, autofocus speed, and sharpness both in the center and on the edges. We’ve got a lot of measurements to make, and we need to repeat each several times to make sure we’re accurate. Plus it’s already obvious that we might find one lens better in the center at f/8 while the other has better corners at f/2.8. But still it looks doable.
To be scientific, though, we need to test a number of different copies of each lens: we all know there is sample variation among lenses, so to be accurate we probably should test a half-dozen copies of each lens. But wait: there’s camera-to-lens variation too. We could do all that testing and all we’ve really concluded is that one lens is better than the other in the above categories (or some of the above categories) on one 5DII camera, SN xxxxxxxxx. It’s very possible the findings would be slightly different on another 5DII body, and much more different if we’re testing the Nikon version of a lens on a Nikon body. It’s overwhelming now, isn’t it?
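To see how fast the test matrix grows, put rough numbers on it. This is only a back-of-the-envelope sketch; the repeat and copy counts are my assumptions, not any reviewer’s actual protocol:

```python
# Back-of-the-envelope count of sharpness test shots for one lens comparison.
# Every count below is an illustrative assumption, not a real test protocol.
focal_lengths = 3   # 24mm, 50mm, 70mm
apertures = 3       # wide open, f/4, f/8
repeats = 5         # repeat each measurement to average out error
copies = 6          # a half-dozen samples of each lens model
models = 2          # the Canon and the Sigma

shots = focal_lengths * apertures * repeats * copies * models
print(shots)  # sharpness only, on a single camera body
```

That’s hundreds of shots before you touch vignetting, flare, or autofocus speed, and adding a second camera body doubles it again. No wonder nobody tests this way.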
But let’s ignore the body-to-lens variables for now. The reality is that review sites typically test one copy of a lens. In the last year, a few sites have made an effort to test 2 or 3 copies of a lens; I know because we help several of them get copies to test. Even then, though, they usually present only the data from the ‘best copy on our test camera’. Remember what I mentioned earlier? A lens review site generates some income, but not enough income to buy 12 copies of a lens for testing, even if they sold them all used later. Not to mention that testing 12 copies increases the number of hours spent testing a lens by a factor of 12. I naively offered to let some reviewers come to LensRentals and test dozens of copies, but then I found out how long it would take and how much equipment they’d have to move. From our side, I can’t take dozens of copies out of stock to send off for reviewer testing. So there you are: stalemate for now. But we’re working on it.
So the bottom line with every review you read online: it describes the behavior of one copy of a lens on one camera. It’s a spot of data in a variable landscape. No one review should be the thing that makes you decide for or against a given lens. But several of them together should give you a good idea of what you will get, and reading a Public Summary review site should alert you to how much variation there might be.
Sites We Recommend
Recommendations are based on content accuracy, or in some cases, entertainment value (and in no particular order – all are worthwhile).
- The Digital Picture (Canon mount only)
- DigLloyd (some free, but mostly subscription. A unique site covering lenses others don’t.)
- Photodo (newly relaunched, was a technical review site, but now is mostly Hands-on)
- 16-9 (specializing in lens-to-lens comparisons, and using lenses on adapters across brands)