
Pay Attention to the Man Behind the Curtain

Published August 4, 2016

 

Metro Goldwyn Mayer

A Reasonably Non-Geeky Guide to Lens Tests and Reviews

Hardly a day goes by without a nasty argument about lens testing and reviewing breaking out on some forum somewhere. (Going forward, I’m going to use the abbreviation R/Ts for Reviewers and Testers.) As with all things on the internet, these discussions tend to gravitate quickly to absolute black-and-white with no gray. This one’s great; this one sucks.

I made a bit of fun of this in a recent post. The truth is that most sites have worthwhile information for you, but none gives you all the information you might want. I’ve spent most of my life in various types of scientific research, and I’m accustomed to evaluating information from an ‘is this valid’ standpoint, not just ‘what does it conclude’. And because I get to see behind the curtain of the photography industry, I have a different perspective than most of you. So I thought I’d share how I look at tests and reviews.

For the Paranoid Among You

I see this speculated about online sometimes, so let’s talk about the pink elephant in the room. These things are obvious facts of life. They aren’t as massively influential as some people think, but they are real to some degree.

R/Ts like to be first. First reviews get the most hits. Plus it’s cool to be the first one with a new toy. I’m guilty of this myself sometimes.

Manufacturers reward R/Ts. It may be getting a prerelease copy to write that first review, an invitation to be a paid speaker at a convention, some free loaner gear, or a photography trip or two. Rewards may have some influence, but there are no huge checks that make R/Ts say whatever the company wants.

There are occasional threats, too. Outright ‘cease and desists’ occur, although rarely (yes, I’ve gotten a couple). More often there is a discussion about why the manufacturer disagrees with something an R/T said. Sometimes the manufacturer is right, too. They have been with me a couple of times.

Most R/Ts don’t care much about rewards or threats. The biggest R/T sites basically get the best treatment from all manufacturers. Other R/Ts buy everything retail and won’t accept a loaner or any other reward to make sure you know they aren’t being influenced. You can usually figure this out pretty quickly if you read more than the graphs from their reviews.

R/Ts want to be credible. If they aren’t credible you won’t go to their site (with a couple of exceptions).

R/Ts do what they do for a reason. They are getting something out of it whether it’s direct income from advertising, publicizing their expertise, or even just getting a tax break for wallowing in their hobby.

The vast majority of R/Ts try hard to be impartial, but we all have our influences and preconceptions. If their income depends on ‘click-through’ purchases there may be a subconscious tendency to praise things a bit much. If they make their revenue from Google ads, generating controversy and lots of visits may be important.

Every one of us makes mistakes. I have, lots of times. I’ve seen others do it lots of times. I, and a lot of R/Ts, tell you when we find an error. Some quietly change results when they figure it out. Others never realize they’ve made a mistake, or refuse to admit it.

Reviewing and Testing are way more time-consuming than you realize. Carefully testing or reviewing a lens takes dedicated, expensive equipment and a week or more of full-time effort. When you say ‘just test more copies’ or ‘why does it take so long’ you’re basically asking someone to cut corners and be sloppy.

The Persistence of Memory, Salvador Dalí

 

So What Do I Look At?

Everything I can. But I ‘consider the source’ for every bit of that information, because there are lots of good, but different, perspectives out there. There is a lot of disinformation, too. But in general, there are several broad categories of R/Ts that I look at, each with a slightly different perspective.

Crowdsourced Information

In some ways, crowdsourced information is the best you can get because it summarizes a lot of people’s experience. The trouble is it takes a while to appear. Early adopters don’t have much of this resource to go on.

It’s also important to know the sources of crowdsourced information and take it all with a grain of salt. There is a nice-sized cottage industry that posts positive and negative product reviews online for a fee at places like Yelp or Amazon. Positive and negative forum posts can be purchased, too. I don’t know if it happens in photography forums, specifically, but I don’t rule out the possibility.

Internet activity at some given moment in time. I wonder who that guy is in the middle of the Sahara. Source: Carna Botnet via Science, 2013.

 

There are three types of online information I consider.

 

One Guy Said Online

This is really good information if you know the person. If you don’t know them, at least by reputation, it’s probably not worth the electrons it’s printed on.

 

Lots of People Said Online

Once dozens or even hundreds of photographers have rated and commented on a given lens the information should be better than ‘one guy’. There’s going to be some bias, of course, reviewer expertise varies, and occasional mass hysteria does occur. But crowdsourced opinion does give you a good idea of how satisfied people are with their purchase.

The trouble is it’s fairly worthless right around release time. Fanboys and trolls dominate these discussions, and people are very willing to comment on things they know nothing about. One thing I do recommend: most crowdsourced review sites let you sort the comments by date. If the lens has been out a while, start with the newest reviews first. Far too many ‘reviews’ are written well before the lens was actually released for sale, or during the hysteria around release time.

 

Lots of People Showed You Online

This is the best; the gold standard. You can look at actual photographs taken with the equipment you’re interested in. There are still shortcomings, of course. Online jpgs tend to make lenses with high contrast look better than those with high resolution, for example. Postprocessing and photographer skill levels vary. But looking at a lot of images can give you the best idea about what a lens can do and whether it’s for you. More importantly, you actually know the person had the lens they’re commenting about.

Of course, us early adopter types don’t have a lot of online images to browse through. And, personally, I tend to get bored after a fairly short time and don’t look at all that many images. I also find myself creating self-fulfilling evaluations. If I want the lens, then bad shots are obviously by bad photographers and good shots reflect how good the lens is. Or vice-versa.

 

A Photographer Reviewed Online

I consider this much different from ‘someone showed you images online’. Reviewers do their thing professionally. They have a site where you can find out about them: what they shoot with, how they make their living, what else they do. This is important to know because all reviewers have some preferences and each has a slightly different shooting style. Most take identical images in every review so you can make some direct comparison between this lens and that one.

A photographer’s review gives you the most information about what it’s like to use the lens in question. Here is where you find out things like how a lens handles, how quickly it autofocuses, how it feels in the hand, and how it compares to similar lenses the reviewer has used in the past. You’re also more likely to find out about color casts, bokeh, light flare, and behavior on specific cameras.

I would never buy a lens without either trying it myself or reading a couple of reputable reviews. OK, I would because I’m an incurable early adopter, often to my detriment. But I shouldn’t.

Critical reading is very important here. Far too often we get a huge forum debate about what “Joe The Reviewer” said, but nobody in the debate knows anything about Joe the Reviewer; they just read the summary of his review. Or more often, they just read someone else’s post about the summary of his review. What kind of photographer is Joe? I don’t mean good or bad. Does Joe work mostly in-studio, or do landscapes, or street photography, or just review lenses? If you shoot a fast-aperture, wide-angle lens in clubs at night, Joe’s review using it mostly stopped down for landscapes may not be pertinent.

I had an early experience with this when the late Michael Reichmann, whom I had the utmost respect for, wrote a very positive review of the Canon 70-300mm DO lens. I bought it on the basis of that review (really just the summary of that review) and HATED it. Before I hopped on my local forum and started saying Michael must have been bribed by the camera company, or that I must have had a bad copy, I actually looked at the review more carefully. Michael had used the lens for images of large shapes with strong contrast and blacked-out shadows, never shooting into the sun. He instinctively knew the limitations of the lens and made great pictures with it. I was using the lens in an entirely different way, one that brought out all of its flaws.

 

Editor’s Note
Upon reading this portion of Roger’s piece, I wanted to agree and interject my own opinions on the topic. As someone who has written for dozens of photography websites over the years, and written hundreds of articles, as an editor I’ve always ensured that writers understood two things. First, the FTC requires that all online reviews that have been paid for by a company be accompanied by ‘clear and conspicuous disclosures’. This means reviewers are required to state if they have been paid in any way for the product review. This can be as easy as stating “I received this new camera from ____ to test”, but it needs to be said somewhere to avoid a potential penalty of up to $16,000. Second, I discuss credibility. Often, websites and other sources share credibility among all their writers and authors; fewer people check who wrote an article than you might think. So it only takes one person to ruin that credibility. While I can’t enforce their credibility, I do make sure that all gear for review comes from a third party (often, companies I’ve worked for have dealings with B&H Photo or other large camera stores, where we can get 30-day loaners for gear reviews without sacrificing impartiality by going through the manufacturer directly), and make sure they understand the stakes at hand.

 

That said, I’ve had companies in the past (I won’t name names) who have tried to bribe me directly by saying, “We’ll send you this product to review, and if you like it…review it and keep it. If you don’t like it, just send it back”. This is absolutely bribery, and it absolutely happens. It’s important to know your reviewer and trust not only their judgment but their integrity as well. That said, the best way to develop an opinion on a piece of gear is to try it yourself.

Zach Sutton

Imatest Tests Online

There are many of these sites. All have very good intentions, and most give very good information. BUT, and this is a HUGE BUT, all Imatest labs differ. The Imatest program always gives scientific-appearing numbers and very pretty graphs. But few Imatest R/Ts tell you the very important details about how they tested. How far away was the chart? What was the type, size, and quality of the chart? (A small inkjet-printed ISO 12233 chart will give very different numbers with the same lens than a large, linotype-printed SFR Plus chart or a backlit film transparency chart.)

Lensrentals.com, 2013

 

Even different lighting can give different results. Then we have to consider if the images were raw or JPG files, and where the tester considered the corner and sides to be. For example, the ‘corner’ measured on an ISO 12233 chart in Imatest is in a very different place than the corner measured using an SFR Plus chart.

You also don’t even know if the corner number in that pretty graph came from an average of 8 measurements (horizontal and vertical in each corner) or just one. Did that center measurement average horizontal and vertical, or just give the higher of the two? Or was it an average of 4 (all 4 sides of the center box)? Oh, and speaking of that, the difference between the vertical and horizontal lines is NOT a direct measurement of astigmatism, and if anyone claims it is, well, don’t put a lot of emphasis on their review.
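
To make that concrete, here’s a sketch (in Python, with made-up MTF50 numbers that aren’t from any real lens) of how far apart those aggregation choices can land:

```python
import numpy as np

# Hypothetical MTF50 readings (lp/ph) for one lens: horizontal and
# vertical measurements in each of the four corners.
corners = np.array([
    [820, 700],   # top-left:  [horizontal, vertical]
    [790, 680],   # top-right
    [760, 650],   # bottom-left
    [805, 690],   # bottom-right
])

print("Average of all 8 readings:", corners.mean())   # 736.875
print("Best single reading:", corners.max())          # 820
print("Worst single reading:", corners.min())         # 650

# Center box: horizontal and vertical readings on all 4 sides.
center = np.array([1010, 990, 1005, 995])
print("Center, average of 4:", center.mean())                        # 1000.0
print("Center, higher of one H/V pair:", max(center[0], center[1]))  # 1010
```

Same lens, same data; the reported ‘corner number’ ranges from 650 to 820 depending on which aggregation the tester chose, and they rarely tell you which one it was.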


 

There are at least a dozen other variables in Imatest testing. Most labs are consistent within the lab, except for variations in the camera used (which also makes a huge difference over time). So comparing Lens X to Lens Y using Joe’s Imatest lab is a valid comparison. Comparing Joe’s lab numbers for Lens Y to Bill’s lab numbers for Lens Y is not. But if Joe and Bill both say Lens X is sharper than Lens Y in the center but softer in the corners, then it probably is. And if Steve says so too, it almost certainly is. If Dave says it’s not, well, maybe his copy wasn’t as good, or maybe his testing methods are different.

So, as someone who tested literally thousands of lenses for several years using Imatest, I recommend you take all of these sites in as a gestalt and do a kind of mental meta-analysis of them. There’s good general data there, but splitting hairs and over-analyzing it will lead you to a lot of wrong conclusions.
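
If you want to do that meta-analysis a little more formally, one simple approach (a sketch with invented scores, not a published method) is to throw away the absolute numbers and compare only each lab’s internal ranking:

```python
# Hypothetical center-sharpness scores from three Imatest labs. The raw
# numbers aren't comparable across labs (different charts, distances,
# cameras); only the ordering within each lab is meaningful.
labs = {
    "Joe":   {"Lens X": 2900, "Lens Y": 2650, "Lens Z": 2700},
    "Bill":  {"Lens X": 1450, "Lens Y": 1300, "Lens Z": 1380},
    "Steve": {"Lens X": 3300, "Lens Y": 3050, "Lens Z": 3020},
}

for lab, scores in labs.items():
    ranking = sorted(scores, key=scores.get, reverse=True)
    print(f"{lab}: {' > '.join(ranking)}")

# Joe:   Lens X > Lens Z > Lens Y
# Bill:  Lens X > Lens Z > Lens Y
# Steve: Lens X > Lens Y > Lens Z
#
# All three labs put Lens X first, so that conclusion is probably solid.
# They split on Y vs Z, so treat that difference as noise.
```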

DxO Analytics Tests

Here’s something you may not know: DxOMark is not the only place to find DxO Analytics data, so there is a good confirmation source for DxO’s data that I use constantly. SLRGear/Imaging Resource uses DxO Analytics targets and software for their tests and presents the results in a nice, easy-to-use format. They also are very transparent about the tests they use and the way they perform them, which gives me a high comfort level with their results.

In general, DxO Analytics is a computerized target analysis, like Imatest. There are a variety of targets for different tests, including dot targets and slant edge targets, among others.

 

 

This gives somewhat different information than most Imatest reviews, and it is presented in DxO’s proprietary ‘blur units’ rather than the more standardized MTF or MTF50 results. We all know DxOMark likes to compress everything down to a single, and in my opinion less than useless, number. (Well, it has some use, since we immediately know that someone who says “DxO rated it as 83.2, the best lens ever!!!!” is a fanboy who doesn’t understand testing at all. Resist the temptation to try to reason with unreasonable people.)

But if you dig deeper, there’s lots of good information: how the lens performs as you stop down, a nice look at the sharpness over the entire field, and other goodies. It’s often more in-depth information than what you get from most Imatest sites.

I use DxO to expand on data from Imatest testing sites. It may give a more complete picture, but not a more accurate picture. There are lots of Imatest sites, and that damps the ‘copy variation’ problem down a lot. There are only two DxO sites, so when they test a bad copy, there aren’t six others to compare it to.

As an aside, nobody, no matter how much testing equipment they have, can take a single copy of any lens and decide it’s a good or bad copy. They can rule out horrid decentering or tilt, or truly awful performance. But that’s about it. They may have the best copy out of 20, or the worst. Trust me on this. There’s a reason I will no longer publish ANY single-lens performance data: I’ve been burnt by publishing data when I only had a single copy and thought it was good.
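
A toy simulation makes the point (the 1000 lp/ph mean and 5% copy-to-copy spread are invented numbers, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)

# Assume this lens model's true mean MTF50 is 1000 lp/ph with a 5%
# copy-to-copy standard deviation -- invented numbers for illustration.
copies = rng.normal(loc=1000.0, scale=50.0, size=20)

print(f"Best of 20 copies:  {copies.max():.0f} lp/ph")
print(f"Worst of 20 copies: {copies.min():.0f} lp/ph")

# The one copy a tester happens to measure falls anywhere in that range.
# A single measurement can flag gross decentering or tilt, but it cannot
# tell you whether the copy is near the top or the bottom of the batch.
```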

Also please don’t compare my series of 10 and say they should do that, too. I publish very limited data on a larger number of lenses, but I don’t go into the depth and detail that an Imatest or DxO site does. Not even close. They probably spend as much time testing their one copy thoroughly as I do just doing MTF curves on 10 copies. Like I said at the beginning, we’re each showing you different things.

Lensrentals Optical Bench Tests

First, we don’t pretend to be, nor intend to become, a full R/T site. We complement them, not replace them. We’re doing somewhat experimental and geeky stuff for our own purposes and letting you guys look over our shoulder. The feedback we get helps us refine our methodology. But our results, while not a true lens test in the reviewer sense, do give you some very different data that I think is worthwhile. How is it different?

 

OlafOpticalTesting, 2016

 

  1. There is no camera. This is good because it eliminates all of the variables of testing different lenses on different cameras. It’s bad, because many of you want to know how the lens is going to behave on your camera.
  2. It’s done at infinity. For most lenses this probably makes no difference. But for macro lenses, designed to work close up, I’d put less emphasis on our infinity testing. For wide-angle lenses, I’d put more emphasis on our optical bench testing, since target analysis testing is done at very close focusing distances, sometimes 4 or 6 feet.
  3. Because it’s experimental and evolving, we’ve refined techniques so some older results aren’t correct, sort of like older Imatest results done on a last generation camera aren’t the same as tests done on a new camera. For example, I just realized (after a subtle hint from Zeiss) that our 55mm Otus results were done before we started rigorously testing every lens with and without cover glass. It turns out the Otus is a better performer with glass in the path, as you can see below, so the data on our website has been incorrect for a year. We screw up sometimes. That’s what happens when you do new stuff.
  4. We only provide very limited testing. Basically at this point we give you wide-open MTF and variation data. And if you don’t like MTF graphs, well, other than ‘higher is better’ you don’t get much out of our data. If you speak MTF, then you get more information than you would out of Imatest or DxO graphs.
OlafOpticalTesting, 2016

 

So when is our data important?

  1. It’s really the only data for copy-to-copy variation out there.
  2. It’s multiple copy data, so it tends to smooth out the sample variation issue. Thought of another way, while we present only one small type of test, we do it on as many copies as you’re likely to see from all the other R/T sites combined.

So I look at our data to see about sample variation, and to look at the MTF curves. That’s about it unless you’re into the more geeky part of testing. Now, because I’m into engineering quality I’d look at our teardowns, too, because there’s nowhere else to find that kind of information. But that’s not really testing. That’s just cool geeky stuff.

TL;DR Version

There’s a lot of information put out there. Most of it is reasonably accurate. Not one site or place is the ‘correct’ one. No single review is the ‘best’ one. It’s like real life. Each of them gives you some information that you can choose to use, or not use, when you make your decisions.

If you’re an intense early adopter, well, just go ahead and get the lens. There’s no sense making yourself crazy trying to justify why you want it, and there’s no way to know, absolutely know, if it’s right for you without using it yourself doing the kind of photography you do.

If you want to make a logical, rational decision about which lens you should get, and it’s not ‘right-around-release-time-insanity’ o’clock, then do some intelligent reading. Screen several Imatest and DxO Analytic sites and get a feel for what they think of the lens. Look at Lensrentals data and see about variation and if, perhaps, there may be some difference between close and infinity performance. (For example, if we find the lens is better than most, but all the Imatest sites say it isn’t, there’s a good chance the lens is better at infinity than it is close up.)

Then read some real-people reviews, looking at more recent ones first and discounting, as much as you can, the ones that came out right around release time when fanboys, trolls, and maybe worse were clogging up bandwidth. Finally, read a couple of professional photography reviews and get some more input on how the lens handles and behaves in various conditions, and look at some photo sites for images taken with the lens. Remember that online jpgs, unless they let you download 100% images, aren’t going to tell you a lot about how sharp the lens is (although it may expose some flaws). But they should give you an idea about color, flare, and bokeh at the very least.

Or, be a fanboy, find one site that says exactly what you want to hear, and go post on your favorite forum that you have found The Truth and all other sites are incompetent liars.

Roger Cicala

Lensrentals.com

August, 2016

 

 

Author: Roger Cicala

I’m Roger and I am the founder of Lensrentals.com. Hailed as one of the optic nerds here, I enjoy shooting collimated light through 30X microscope objectives in my spare time. When I do take real pictures I like using something different: a medium format camera, a Pentax K1, or a Sony RX1R.

  • Mark

    This is purely anecdotal, but one day I took 5 prime lenses (EF 40f2.8, EF 100f2.0, EF 300f4L, and two ancient 50mm Zuiko OM mount lenses) and tested them from MFD out to 70 feet to see how my resolution changed through that range. Except for the 300f4L, which was indistinguishable throughout the range, all of the lenses (which I know have simple focusing mechanisms) were softer at MFD but indistinguishable from 5 ft out to 70 ft. So take it for what it’s worth, just an anecdote of 5 relatively simple non-retrofocus, non-zoom lenses. I was surprised how similarly they acted. I have no idea what the 70 ft to infinity performance is on any of them, or what a zoom would do; a zoom might vary in crazy ways at different FLs for all I know.

  • Afsmitty

    Can’t disagree that Ken Rockwell has a lock on the stats and simple statements markets.

    However, if you’re actually interested in an infrared focus index, you should know that what he typically cites as an infrared focus index is just a normal focus index for the various focal lengths on a non-parfocal zoom. (Typically the red numbers correspond directly to the focal lengths stamped on the barrel.)

    For instance, if you are shooting at 24mm and you want to focus at infinity, you should align the infinity mark with the red 24. If you’re shooting at 105, the infinity mark should be aligned with the 105.

    That’s the idea anyway. You’ll probably get better results if you adjust in the viewfinder, or better yet use the LCD and magnify. I only bring it up because if you are shooting infrared, you shouldn’t let those marks lead you astray.

  • Brandon Dube

    On-axis MTF50 with a 0mm coverglass for the Otus is 40 lp/mm. Multiplying by the 21.64mm picture height gives an MTF50 of 865 lp/ph. With a 2mm coverglass it’s about 56 lp/mm, or 1211 lp/ph. A difference of around 350 lp/ph.

  • Brandon Dube

    MTF is a “lump sum” measurement and the standard for image quality. It is the greatest level of reduction that is still useful. MTF50 and similar metrics are not informative. SQF has merit, but only in the context of human consumption. Pure MTF is useful for man and machine alike. MTF itself is a total truth; unless the measurement was made incorrectly, one cannot lie with MTF.

    However, there are a lot of parameters that affect MTF: focusing distance of the MTF bench, focusing distance of the lens, temperature, color, the coverglass used (or the lack of one), and perhaps a few non-obvious ones. Three of these are distinctly useful to lie with MTF, or stretch the truth if you would prefer that language: focusing distance, color, and coverglass.

    There is nothing wrong with a coverglass-less measurement. In fact, it wasn’t until 2004, some 10 years after the popularization of digital cameras, that anyone even cared about the coverglass in lens design at all. The model isn’t very complex; the designer puts an x.x mm slab of N-BK7 glass in front of the sensor. The actual stack contains 4-5 different materials, birefringence effects, color filtering, and a host of other properties.

    If you put an Otus lens in front of film, it behaves like the coverglass-less measurement. If you put it in front of a Kolari-modified A7 series camera, it behaves like a ~0.5mm coverglass measurement. If you put it in front of a Canon camera from the 5Dc’s generation or older, it behaves like a 3mm coverglass. In front of most micro four thirds cameras, 4mm coverglass. If you put it in front of an Arri Alexa, it behaves like a 7mm coverglass measurement. The list goes on. We could also do best focus at all field points and remove field curvature and tilts of the image plane; it’s still a truthful measurement, and some manufacturers do exactly that. We could do monochromatic or quasi-monochromatic measurements and remove lateral and axial color.

    I wouldn’t say the coverglass-less Otus measurements are “questionable” – they are merely a different truth, a different picture.

    If you want to question my morals or any of that – around 65% of OLAF’s database is still measurements I made, I’m currently employed at Optikos, the company that made OLAF the machine, and can work at Zeiss any summer I choose. I will likely go next summer – the timing didn’t work in ’15, and I could not go this year for personal reasons.

    If you would like, I could get you a tour of Zeiss SMT or Meditec whenever you would like. I have a good working relationship with them.

  • Edna Bambrick

    Brandon, that’s a lot of fancy talk for a guy that didn’t bother to question his questionable results and went on to repeat the same mistakes in test after test last summer.

    This is probably what happens when someone has a pre-existing bias and looks to confirm it through testing.

  • Brandon Dube

    The Trioptics MTF bench (OLAF isn’t an MTF bench and couldn’t measure MTF if we wanted it to 🙂 ) can be used to predict the MTF results you would get with Imatest. Under ideal conditions for each, Imatest still outpaces an MTF bench for speed; it’s simple physics. Mounting and unmounting lenses on a camera bayonet and focusing them on a test chart takes well under a minute. An 84-field-point MTF test involves moving the target generator and collimator, as well as the image analyzer, around. These take time, a lot longer than focusing the lens.

    The upshot is that the MTF bench is very accurate and repeatable. Imatest is… less so.

    To predict the MTF on a camera from lens-only measurements, assuming the lens was measured with the proper coverglass, one need only multiply by the MTF of the sensor. That MTF can be broken into two components that multiply: the “bare sensor” and its anti-aliasing filter. The bare sensor part is fairly straightforward; it is the Fourier transform of the point spread function of the sensor. For rectangular pixels, the PSF is a rect function, and its FT is a sinc function. The anti-aliasing filter is less straightforward, but can be experimentally determined.
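
    As a minimal sketch of that prediction (assuming a 100% fill-factor square pixel, an invented 5-micron pixel pitch, and a stand-in lens curve; the AA filter term is omitted):

    ```python
    import numpy as np

    pitch_mm = 0.005                   # assumed 5-micron pixel pitch, for illustration
    f = np.linspace(0.0, 100.0, 201)   # spatial frequency, cycles/mm

    # Stand-in lens-only MTF curve, as a bench might report it.
    mtf_lens = np.exp(-f / 60.0)

    # MTF of a 100% fill-factor square pixel is |sinc(f * pitch)|;
    # numpy's sinc is the normalized sin(pi*x)/(pi*x).
    mtf_pixel = np.abs(np.sinc(f * pitch_mm))

    # Predicted system MTF (the anti-aliasing filter would multiply in
    # as a third, experimentally determined term).
    mtf_system = mtf_lens * mtf_pixel

    i = 100                            # index of f = 50 cycles/mm
    print(f"Lens-only MTF at 50 cy/mm:  {mtf_lens[i]:.2f}")
    print(f"Predicted system MTF there: {mtf_system[i]:.2f}")
    ```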

  • Roger Cicala

    I agree with your points overall, Edna, but with one exception. Imatest is so ‘setup critical’ with a need to be oriented within a fraction of a degree in three planes, and a mm or so in centering, that while it’s a good indicator of center sharpness, it can be a very weak indicator of both off center sharpness and especially of a tilted or mildly decentered lens. For example, I can test a lens today, put it on the shelf and test it again tomorrow on the optical bench and the results vary less than 1%. When I did the same thing with Imatest, a 5% variation was about average (and that was after months of practice refining technique; when we started 10% was quite common). Reproducibility was the main motivator of change for me, not just time.

  • Edna Bambrick

    Andre, if you have the Otus 55 did you really ever have a doubt? (grin) It never made sense to me.

  • Edna Bambrick

    My main point is that no one is going into the field with an OLAF and taking photos. Imatest system (lens and sensor) tests still seem to be the most relevant tests for the average user when considering whether or not they have a good or bad copy and what will end up being their actual results in the field, as well as their own field testing for how they will use the lenses.

    When I do portraits, I use the Nikon 70-200 at f4 – 5.6 initially because 98% of the time those images will be sharp enough (too sharp and they show too many flaws that are time consuming to fix in post) and the fast AF allows me to catch expressions as they are revealed. If the subject is willing and time allows I’ll swap in the Otus 85 (f2.0 – f4) or Apo 135 (f2.8 – f4) and take a few more that may or may not end up being keepers.

    I understand that Imatesting systems is more time consuming and OLAF helps with the volume and r&r needs of a very active rental company, but it would be nice to see some OLAF results tested against logical systems so one can have apples to apples, or at the least apples to oranges, results to compare.

  • Edna, it’s a good question and one without a good answer. In this case the difference wouldn’t change my post’s headlines or conclusions. Unfortunately, you are correct, some reposting sites put screaming “New Sony better than Otus” headlines up to create controversy. I can’t control that, but I have posted this here, will correct the last article with an addendum, and have been in contact with both Zeiss USA and Zeiss AG and they are aware.

    This isn’t the first, nor will it probably be the last, time I’ve had to change a result. The one I still feel worst about was when I bowed to pressure and tested a single copy of a lens that looked fine, but that wasn’t. I had to apologize to Sigma for that one.

    But that was one of the motivators in writing this: I say all the time that every site, mine included, is going to have misinformation sometimes, no matter how much we try not to. The kind of odd thing here is that when the Otus 55 results were first done, I thought they looked wrong and we redid them, getting the same result. We just didn’t know enough.

  • Edna Bambrick

    Sadly, it does take all of these sources to come to a conclusion. I love my Zeiss 21mm Distagon but my 14-24 Nikon performs just as well except at f2.8. But because of the flexibility and the AF, the Zeiss sits on the shelf and the Nikon is always in the carry bag.

  • Edna Bambrick

    At the end of the day, it ends up being about a system and technique. I shot with Sony for several years but kept having the problem of getting my images to be as sharp as what I saw in the viewfinder. Was it shutter shock, bad technique, sensor glass thickness, chief ray angle, a wobbly tripod etc? Who knows.

    But finally equipped with an RRS, and equally sharp Zeiss 135 APOs I conducted a side by side test with a friend using a Nikon D810. When I looked at his RAW images I was blown away by the results and bought an 810 that night. I never looked back except to make some comparisons that always confirmed that mind blowing first impression.

    Now I see results like the Sony/Otus comparison and, despite OLAF’s precise numbers, I never see Sony results that match what I see in field results. Even with eye detection, 5-axis IS, highly rated/reviewed lenses, and higher pixel densities, the images never approach the numbers.

    Below is a hand held photo (you can confirm in the reflection) taken with the D810 and Otus 85mm on a less than static subject and without flash or a focusing aid (other than the OVF of the D810 and the DK17M). Even in R/T reviews that attempt to promote the sharpness and clarity of the lenses/system they are using I never see an image as clear as this. And this was a casually taken photo not a lab or testing situation where I would always use a tripod and better lighting and lower ISO.

    Hopefully the image can be clearly seen on here, if not it’s on DPR also under my name.

  • Edna Bambrick

    Commendable honesty and integrity from Zach ! Bravo.

    Roger – how do you put the toothpaste back in the tube when it comes to the new Sony and the Otus results ?

    You make a banner headline and that news travels pretty fast and as first impressions often go, they stick more than the reality. A good example is in politics this week. e.g. “$400M ransom paid to Iran” when this is in fact money the US owed them for decades and was decided in international courts. Of course the timing is unfortunate but at the same time, it makes sense that when trying to establish trust one pays their old debts.

    Last but not least, that loner in the middle of the Sahara…. HotCamels.com ?

  • … That might be the longest, most interesting, and most enlightening leadup to a pun I’ve ever seen. Well played, sir, well played indeed.

  • obican

    Aside from the technical part of R/T, there are times when I’d just like to know about or remember some specs, and then Ken Rockwell is the king. He has been using the same format since the dawn of the Internet and he lists all the useful specs, even the ones you didn’t know were useful.

    This becomes even more apparent when I’m trying to put up a comparison chart for some reason. Then you realize most websites blatantly copy the manufacturer’s specs, which may or may not include the weight of the kit lens or maybe the battery.

    He also lists some ergonomic inconsistencies when he sees them: an oddball filter size, noises made when shaken (I thought my Sigma 30/2.8 was broken when I had it for the first time), the number of apertures, an infrared focus index, etc. I don’t have to skim through a wall of text and the personal gloating of an online reviewer to find out if the lens has focus breathing***.

    His optical tests are not always accurate (most of the time he doesn’t even have the lens), but I actually quite like how he simplifies a lot of things. He does it mostly the right way, the exact opposite of DXO. DXO bakes performance into single numbers and makes you think a sharpness score of 25 is so much better than a 22, and that a 1430 sports score for a sensor is miles ahead of a score of 1079.

    If you want to split hairs (sometimes we all do), the best way is being systematic and testing multiple copies on even ground (just like you). But do we really want to split all those hairs? Do I really care if a Sigma Art on a Canon 5D3 is sharper than the FE 55 on a Sony a7? I’m not going to switch to Sony just because it’s 5% sharper in the dead center at f/2. Rockwell would simply say that the Sony is optically superb and the Sigma is the best AF 50mm, but not by that much.

    I’d like to know if it’s a clearly a dud or overall sharper than an old 50/1.4 SLR lens (site:photozone.de). I’d rather like to know if there is a high risk that the copy I’m going to buy would be unusably bad (site:lensrentals.com). I’d like to know if it will die on me in 2 years (site:facebook.com/groups). I’d like to see the overall rendering of the lens (site:flickr.com).

    Then I usually go out and find a way to handle the lens myself. I can’t really rent stuff where I live. Luckily, it’s not so hard to go into a store and ask to handle the lens. Some companies are much better in that regard than others. Fujifilm holds street workshops every weekend where they give you every single camera and lens you want for a few hours and you do a photowalk with their instructors. They don’t pressure you to buy or do anything; it’s just free time with cameras and lenses. Sony has a concept store where they don’t sell anything at all and you’re free to try anything (I walked out with an A7RII and 135/1.8 to see the AF performance and rendering).

    These experiences are much better than reading reviews. Also, most of the time I discover a lot of stuff that was never mentioned in R/Ts. Again, not technical stuff; I don’t even look for that. (However, I think Fuji has terrible center variation and decentering problems on most of their lenses; I could see some issues without even looking.)

    For example, did you know that only wide-angle Fuji primes have push-pull focus rings and distance scales? Even though they all have electronic manual focus, they actually follow the distance scale on the lens, so you can shoot at hyperfocal without even looking at the screen. This is only a feature on the wider primes, where it would be useful; the longer lenses, pancakes, and zooms all have infinitely turning focus rings sans distance numbers. I never saw that info in an online review.

    TL;DR: Ken Rockwell does a better job at R/T than most people, without even seeing most of the gear at all.

    ***That’s also another reason I like Photozone, I know exactly where to look in their reviews to find the piece of information I need.

  • obican

    Speaking of floating elements, it would be geeky (and both quite useless and time consuming) to test one of the old CF FLE lenses from the Hasselblad V system, which had an FLE ring you had to set to the correct focus distance. While deliberately choosing the wrong settings, of course ;).

  • Andrew, it’s a lens design issue – basically lenses with short exit-pupil distances are more likely to be affected. This is generally wide-angle lenses, but it can be others, too. We’ve had to learn as we go with that. Most 50mm lenses on SLRs don’t have the issue, and in last year’s tests we weren’t rigid and OCD enough with checking every lens carefully.

    Roger

  • Franck Mée

    I remember testing the Sony 16-50mm when it came out. The Sony sample was so astoundingly bad that we asked for other samples to cross-check it; Sony sent us another one, and one of our readers who had just bought a kit dropped by with his own. The second Sony sample was better than the first, and the reader’s was definitely the best (or the “least bad”) of the three. In France at least, Sony are definitely not handpicking good products for R/Ts.
    (And this lens had the worst copy-to-copy variation I’ve ever seen, by the way. Don’t know if it improved with time.)

  • It definitely has implications for adapting that lens to other sensors and how much of its performance you can extract.

  • l_d_allan

    I’m curious about the impact with an adapted Otus on a mirrorless body such as the a7RII. Possibly 100 to 200 lp/ph with Imatest? It depends? Who knows?

  • l_d_allan

    Mr. Reichmann passed away May 18, 2016 (the day before I became officially old). This wretched chief of sinners has no idea where he was spiritually, but my prayer is:
    * Rest in peace
    * Gone to a better place
    * May God have mercy (and grace) on his soul
    Amen and shalom.

  • Roger, thanks for showing the corrected 55 Otus results: it’s great of you guys to correct the record! As you know, there was much discussion of the former results in the various forums with some amount of hand-wringing amongst us enthusiasts.

    Is the cover glass factor a 55 Otus-specific issue, a Zeiss issue, or a Canon sensor issue, i.e., one that affects all results for all EF mount lenses?

  • l_d_allan

    My (cynical?) speculation is that well-known pros like Mr. Kelby might pre-order 10 copies, then have an assistant attempt to sort out the “good copies” vs “bad copies” vs “stellar 5-star copies”, then return the 9 also-rans.

    I may be delusional on this, but I think that with attention to detail, a well lit brick wall, and proper technique, I could sort out 10 copies of primes with high variance (low consistency) into “bad copies”, “so so copies within spec”, and “good copies in the upper 50%”.

    Or at least identify outliers / rejects? Or ???

  • taildraggin

    FWIW, I’m fairly certain that I caught a vendor plant on an influential forum. He was a fairly talented, but unknown artist from London that posted up many “can’t wait” build up messages each day for (exactly) 60 days before release, then totally disappeared on launch day.

  • Lynn, I know screening goes on, but I’m also aware most companies aren’t very good at cherry picking, honestly. Sony couldn’t find a good lens in a batch if they wanted to. Zeiss or Canon certainly could, although I don’t know to what extent they do.
    What cherry picking there is, though, is going to be those early review copies for certain, and another reason I recommend waiting for the “I bought mine at the store” reviews. Although, full disclosure makes me admit I’m not the type who waits very well.
    Roger

  • l_d_allan

    Another great way to evaluate a lens … which RC/LR may be reluctant to mention … is to rent and then maybe buy.

    Something that I didn’t notice being mentioned … I assume that some of the better known R/T’s get “cherry picked” lenses. Even if they pay for the equipment (and not bribed with long-term loaners), their copy would perhaps have been rigorously tested to be not only within spec, but the equivalent of a “blue printed NASCAR race engine”.

    To name names, Brian Smith as a Sony Artisan probably doesn’t use an average but notorious Zony FE35 f/1.4 that may or may not be a bad copy. Scott Kelby isn’t using a “runt of the litter” EF 11-24, but rather a “pick of the litter”.

    Or not? Actually, I hope I’m mistaken. I learn more when I’m wrong.

    I also give huge credibility to R/T’s that at times give negative reviews, rather than gushing pablum. PhotoZone.de has been brutal on not just a few lenses. I don’t recall anything but positive gushers from low credibility Ken Rockwell, for example.

  • I would like to know too. I think for better lenses with floating elements it should be very little different. But I don’t know for sure.

  • John Dillworth

    Another excellent article. Yes, companies don’t generally pay for reviews, but there seem to be plenty of early access programs that make gear available to more than a few select people. So the day the product is officially announced, there are plenty of reviews (unsurprisingly, all great) the day of or the day after the release. I understand companies like Fuji have to compete with the big boys and this is a pretty good way to get free word of mouth. So what’s a consumer to do? Pretty much disregard reviews from people that have had early access. Maybe the reviews are honest, maybe a little embellished. There really is no way to know what’s true beyond specs and features. So to me, the early access reviews are worthless. I just have to wait until someone acquires the gear the way the regular consumer does. Probably best that way. I might not be the first kid on the block to own the newest stuff, but I’m not the guy that has to deal with all the early problems some camera manufacturers seem to have (looking at you, Nikon & Fuji).

  • Franck Mée

    As a tech journalist, I’ve been an R/T for 5 years and I couldn’t agree more.
    Yet, I thought I might share some experience of my own.

    About the “We’ll send you this product to review, and if you like it…review it and keep it. If you don’t like it, just send it back” offer: I’ve known this situation a couple of times. I know for a fact this usually happens only with shitty products. When the product is good, manufacturers don’t feel like they have to offer it in the hope you won’t hit too hard.
    So they usually ended up being the products I really released the Kraken on, using nasty metaphors and harsh remarks on how bad they were. And I know I wasn’t the only R/T who worked that way: never getting a second bad gift is a very low price to pay for an exhilarating occasion to blow off some steam. >:-)

    About bought publications: in 2010 I received at my personal address a message from a subcontractor of a manufacturer. They had noted I was very active on a specific, well-known forum in my country, and proposed to send me some of their products so I could spread the word about how wonderful they were. Of course, they just hadn’t realized that I was a journalist, nor that the reason I was so active on this forum was that it was my job to answer my own readers following my own articles. So I forwarded their messages to my boss, we had fun for a few days, and then we forwarded the whole lot to the national subsidiary of this manufacturer with a big “WTF ?” as the mail subject, and we had even more fun watching this go up the food chain in the company, right back to Asia, until the subcontractor vanished into thin air — apparently, their contract was abruptly terminated.

    I don’t share these two anecdotes to say I’m the good, white knight who can’t be bought, but just to say I feel you’re very right about something: most R/Ts don’t really care about gifts. We care about our own credibility, and we don’t get that by selling ourselves. And it’s a time-consuming, repetitive and often underrated task, so when we have a chance at a good laugh during office hours, it is so much more important than getting a product for free!

    One last story if I may: manufacturers know R/Ts. Generally, they know better than to try to bribe them.
    Some background first: a few years back, I worked for a reviews website whose policy was to totally discount the price tag when reviewing a product. The idea was: if you have €100 and €150 products which are both average, and you give a bonus (say C+ instead of C) to the cheaper one, it might be confusing two months later, when the prices even out and some readers will think the first one is better than the second. So we just didn’t use price at all during reviews, and entry-level gear logically got lots of Cs and Ds; but when you selected a specific price point, you knew that the first one listed was the one that coped best with our lab, whatever its price might have been when it was released.
    Then there was a manufacturer who had decided they wouldn’t let one of their products get an average grade, so they simply refused to lend us one of their entry-level cameras. We were saying something like “well, we know it’ll probably get a C, but at this price every competitor got an E or an F, so when readers select sub-€150 products yours will definitely be on top of the list”. And they still retorted “no, your test protocol is too much of a challenge for this kind of product; we will let you review the best ones but you won’t have this one”.
    We actually got one through one of our readers and it was, as expected, average — and by that I also mean: far better than anything else at its price point.
    But since then I know one thing: rather than try to bribe them, smart manufacturers simply prefer to dodge challenger R/Ts.

    (Yeah, I wrote all that just to get to this conclusion. Sorry about that, I just HAD to get it out. Yet, this was a true story I lived myself, and I believe the lesson is valid.)

  • sala.nimi

    Good info as usual.
    You say that you measure lenses at infinity focus and that they may be different focused closer. It would be nice to know how much different.

    Is Michael Reichmann from Luminous Landscape dead? His site became a pay site so I didn’t know. His site taught me a lot about digital photography when I bought my first digital SLR.
