The Great Flange-to-Sensor Distance Article. Part II: Photo Cameras
If you’re a photo shooter, chances are you didn’t read Part I of this series on Cine Cameras. If you’re at all interested in the methodology, you should give it a glance, because I’m not going over that again. Today I’m going to just talk about photo bodies and what we found measuring flange-to-sensor distance.
Well, how did we get here?
In the previous article, we pointed out that variation could make cinema lens distance scales inaccurate and, at least with wide-angle lenses, make it impossible to achieve infinity focus on lenses with hard infinity stops. Neither of those things matters for photo cameras. Distance scales aren’t critically accurate on photo lenses, and the lenses can all focus past infinity; they don’t have a hard stop.
Since we assumed that flange-to-sensor distance wouldn’t matter for photo cameras, you might wonder why we bothered to test them. Mostly because we could, I guess. Plus, no one had done it before. The fact that nobody’s done it before is always attractive, and we test a lot of things on that basis. Most of the time, we don’t write about it because all we find is it really wasn’t worth doing.
So, in true internet fashion, we assumed photo bodies would vary more than Cine cameras, and we assumed that testing for it would be a waste of time. As usual, both assumptions were wrong. Roger’s Formula of Internet Wrongness is W = iS × A², where W is Wrongness, A equals the number of incorrect assumptions, and iS is the Internet Stupidity Constant (0.737475706964). So, our thinking was 2.95 Internet Wrongness Units of bad. (Since I have invented this concept of Internet Wrongness Units, I hereby declare one Wrongness Unit should henceforth be labelled as a Roger.)
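If you want to check my arithmetic, here’s a throwaway Python sketch; the constant and the formula are straight from the paragraph above, nothing else is official:

```python
# Roger's Formula of Internet Wrongness: W = iS * A^2
IS = 0.737475706964  # the Internet Stupidity Constant (see the Easter Egg)

def wrongness(assumptions: int) -> float:
    """Wrongness in Rogers, given the number of incorrect assumptions."""
    return IS * assumptions ** 2

print(round(wrongness(2), 2))  # two wrong assumptions -> 2.95 Rogers
```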
And we weren’t even done with our wrongness there. You see, we bought the Denz equipment because we thought its biggest purpose was to improve matching Cine lenses to Cine cameras. Turns out that for us, at least, its greater purpose is in detecting sort-of broken photo cameras.
Oh, I left an Easter Egg — the Internet Stupidity Constant. First one to figure it out in the comments wins a cool prize of pretty much no financial value.
I’m going to start with a graph of the flange-to-sensor measurements of 478 Canon photo cameras of various types, ranging from Canon 1DxII to crop-sensor bodies. As we did with the Cine tests, the first thing we looked for was outliers. The four circled in red are pretty apparent outliers. The seven circled in yellow are borderline, although they seem close to the expected range.
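The article doesn’t spell out the exact cutoff we used to circle outliers (a lot of it is eyeballing the graph), but the idea can be sketched with a simple standard-deviation rule. The data and the k=2 threshold below are made up for illustration:

```python
from statistics import mean, stdev

def flag_outliers(readings, k=2.0):
    """Return readings more than k standard deviations from the mean."""
    m, s = mean(readings), stdev(readings)
    return [r for r in readings if abs(r - m) > k * s]

# Hypothetical flange-to-sensor deviations in mm (not our real data)
sample = [0.00, 0.01, -0.01, 0.02, 0.00, -0.02, 0.01, 0.15]
print(flag_outliers(sample))  # -> [0.15]
```

In practice, a fixed tolerance band from the manufacturer’s spec makes more sense than a statistical cutoff, but the statistical version is what you’d use when, as here, nobody publishes the spec.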
Before we get into the outliers, let’s look at the same cameras by type, this time with the outliers removed. Not a lot to see here; certainly no distinct differences by camera type. I checked the age/use of each type also; they all averaged slightly over 110 days of rental. I also removed the Canon 5D MkIII from this second graph to keep it cleaner; there were only a dozen of them, they are all older (200 days rental average), and they all measured between 0.0 and 0.02.
So, mild surprise #1: flange-to-sensor distance in photo cameras doesn’t vary any more than in Cine cameras.
Now let’s discuss those outlier cameras a bit. I had all of the red- and yellow-circled cameras in the first graph pulled, inspected by both camera and repair techs, and had their complaint history evaluated. I didn’t expect to find much because I genuinely expected there would be no significant problems if an SLR mount varied a bit.
The first real surprise I got was which cameras were pulled because of variation. We pulled 11 cameras out of 478 (that’s 2.3%, about the same as Cine cameras). All 11 of them were either 5D IV or 1Dx bodies, which is not at all what I expected. There were 291 total 5D IV and 1Dx bodies, so 3.75% of those bodies were out of spec; none of the other Canon cameras were. If, when we started this, you thought ‘yeah, all the outliers will be from the most expensive camera types,’ then you’re way smarter than me.
In evaluating these eleven cameras, we found that two cameras that measured -0.03 and two that measured +0.04 had no significant history and functioned perfectly; they were returned to stock.
That left seven cameras with some significant history: four 5D IV and three 1Dx bodies. Three of the 5D IV cameras had been reported (by the customer) to have taken a significant drop with a lens attached; we had tested them, considered them fine, and returned them to stock. The fourth had had a sensor assembly replacement a year ago for a scratched sensor.
One of the 1Dx bodies had had a mirror box replacement after a bad fall with a lens attached. The other two had no clear history of problems, but both were older, and both had notes in the past that they had come back from rental with a dent or significant cosmetic damage (in other words, probably a drop the customer didn’t report). Both of these also required significant AF microadjustment, 15 or so. In both cases, when we replaced the lens mount the cameras went back in spec, AND they no longer required the large micro-adjustment. It’s only two cameras, but it’s an interesting finding.
So, while this isn’t a huge number of cameras, a history of a drop and/or significant internal work is present in nearly all of them. I speculate, then, that these larger cameras (which are more likely to be mounted to larger lenses and perhaps more likely to be used in rough conditions) don’t take a drop well. That makes sense; force is proportional to mass and all.
It’s interesting to note that while all of these cameras functioned OK (they passed our routine inspection, which is pretty thorough), there were some subtle complaints. I mentioned the two 1Dx requiring more AF microadjustment than we like. One of the 5D IV cameras had a complaint that it seemed to hunt on autofocus a bit. I’m not sure if this is significant; we couldn’t really reproduce it. But we’re indoors with good lighting in the testing area; customers use cameras in other conditions.
This was not, at all, what I expected. I was a little surprised that photo cameras didn’t vary more than cine cameras. The fact that ‘dropped with a lens attached’ affected the mount does seem pretty obvious, in retrospect. The fact that it appears to affect larger camera bodies that are likely to be attached to larger lenses makes sense, too. A camera body may have ruggedized features, but a lens mount is a lens mount. It just wasn’t what I expected. Things seldom are.
Even the way-out-of-spec cameras seemed to have functioned well, though, with only subtle and hard-to-reproduce issues. So, the ‘it doesn’t matter on photo cameras’ school of thought is technically correct. Still, I’d rather have a camera that wasn’t an outlier, so checking the flange-to-sensor distance on photo cameras seems like a good idea going forward.
The AF microadjustment part is interesting, but it’s just two cameras, so I don’t want to draw any sweeping conclusions. That’s not going to stop the internet from drawing sweeping conclusions, I know, but I can’t help that. When you see someone reposting this with a click-bait headline, please remember that wasn’t my headline.
Sony Mirrorless Bodies
We tested 487 Sony mirrorless bodies of various types. First, let me mention that we asked and answered the obvious question: “does the IBIS system change the flange-to-sensor distance?” We took some cameras, measured them, put a lens on, focused it on various things to run the IBIS, and measured again, about a dozen times each. Each camera had identical measurements every time.
The overall graph for the Sony bodies exhibits more variation. Note that the range for this chart is -0.1mm to +0.1mm, larger than that of the first Canon graph above.
I’m going to separate out the camera types in a couple of graphs; there’s a lot of them, and showing too many at once gets confusing. First, let’s go with the two groups at the opposite ends of the price spectrum; the A9s and the crop-sensor cameras (A6xxx). Why those two, you ask? Because of their awesomeness. They look marvelous. (I’m keeping the range the same in the next few graphs for perspective.) Let me note that the A9 bodies had lower use than most other camera bodies, averaging 60 rental days each.
Next, I grouped the A7 III, A7R IV, and A7R III cameras. The A7R III and A7 III bodies averaged over 110 rental days (similar to the Canon cameras above), while the A7R IV average was significantly lower, at 56 days. These are a bit less marvelous from the variation standpoint.
The average reading for all three cameras is similar; it’s the outliers that seem dramatic.
The A7S II cameras were quite similar to the other A7 cameras; I’m just graphing them separately to keep things legible. These also tended to be higher use, at 190 days average.
I’m going to talk about all the outliers first because that was what stood out. We pulled 28 cameras for repair department evaluation (5.75%). As with the Canon cameras, all had passed inspections. None of these had a clear history of being dropped. (Maybe Sony users are more careful. Maybe lighter cameras drop better. I don’t know. I doubt you do, either. But I’m sure someone is going to start a thread with “Cicala shows Sony users never drop cameras.”)
A couple had one or two ‘iffy’ complaints, either that images seemed a little soft or the stabilization wasn’t as good as expected, but most had no complaints at all. (To be clear, about one in 20 rentals or so has a ‘maybe something’ complaint, about 90% of the time we can’t reproduce it, and it goes on to rent with no further complaints.)
When repair inspected them, there were a few that had loose lens mount screws and a couple that improved when the lens mount was replaced, so we assume the mount was bent slightly. The ones that were large outliers, though, had a more unexpected issue: a fracture between the sensor mount and the stabilizing system. You can look at this old teardown of the A7R III, which shows that the sensor is mounted to a plastic plate that attaches with three screws to the in-body image stabilizing system.
When we took apart these way-out-of-spec bodies, we found in several cases that the plastic plate had fractured. Two of these cameras are shown in images below. The dark black area to the right of the arrow is the back of the shutter. The large metal assembly towards the bottom left of both images is the IBIS system. The sensor is hidden in front of it, but you can see the sensor-to-IBIS mounts are broken where the arrows point.
You don’t have a scale to go by, but the broken pieces are shifted by 0.5mm or so. Of course, the center of the sensor, where we measured, is shifted less than that, but several readings were in the +0.15 or -0.18 range (I cut the charts off at +/- 0.1 above).
On two others, one of the screws holding the sensor in place had backed out.
There was another with a metal-fatigue-type fracture in the mount.
And one that had a displaced fracture of the sensor frame (above and to the left of the arrow) severe enough to pop off the retaining clip (arrow point).
The amazing thing to me is that despite what I would have thought was disabling damage, the cameras really didn’t show much dysfunction. If you had just shown me the pictures above, I would have expected error messages, horrible images, something dramatic. These were regularly renting; customers were happy with them; they passed our 64-point tech inspection before and after each rental.
When we KNEW something was wrong and did hours of stress testing by our most experienced techs, all we came up with were the same things that a couple of customers had said. “Seems images might be a little soft on one side,” or “maybe the stabilization isn’t quite as good as it should be.” Only when we opened them up did we find the problem.
In each case, only one of the plastic mounts was broken; the other two mounts were fine. I suspect two broken mounts would be dramatically bad and probably would have just been sent to the service center. (We only do limited Sony work in-house, so we would have just sent them to factory service once we detected IBIS or sensor problems.) One broken mount is apparently pretty subtle, though, at least most of the time.
We bought this tool to check back focus distance on cinema cameras. We find it’s an extremely sensitive tool to detect sensor mount damage we didn’t know existed. Just like with the Canon cameras, you can make an argument that we’re fixing a problem that isn’t a problem; the cameras were working. And I’ll give you the same response. That’s great, but I’d want the camera I was using to not have a broken mount.
When the dust settled, we found that actual lens mount problems occurred in under 2% of our Sony camera stock, about the same rate as our Canon cameras. In the early days of Sony mirrorless, there were problems with the lens mount; that has obviously been fixed. Eight other Sony cameras (1.6% of all Sony or 2.8% of A7xxx cameras) had a sensor mount issue.
Micro 4/3 Cameras
We don’t have nearly as many m4/3 cameras as we do the others. We tested 138 of them; 82 Panasonic, 36 Olympus, and 20 Black Magic Pocket 4k.
First, please note the graph range is different again, in this graph from -0.02 to +0.13. Also, I should mention that the average age of the m4/3 cameras was about 130 days of use.
The average m4/3 camera measures at about +0.06 on the Denz optical test. Since 19.25mm is the nominal distance, the Denz is saying they actually average 19.19mm. There are several possible explanations for this, including manufacturer preference, the effect of thicker sensor cover glass on our measurements, and things I haven’t thought of. I don’t know the answer (yet); at this point, I was mostly looking for outliers.
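The conversion is just subtraction, assuming (as the numbers above imply) that a positive Denz reading means the sensor sits closer than nominal. A minimal sketch, with that sign convention as my assumption:

```python
NOMINAL_M43 = 19.25  # nominal m4/3 flange-to-sensor distance, in mm

def actual_distance(nominal_mm: float, denz_reading_mm: float) -> float:
    """Convert a Denz deviation reading to an absolute distance.
    Sign convention assumed from the article: +0.06 -> 19.19mm."""
    return nominal_mm - denz_reading_mm

print(round(actual_distance(NOMINAL_M43, 0.06), 2))  # -> 19.19
```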
We considered three cameras as certain outliers (-0.02, -0.01, and +0.13) and the two at 0.0 and 0.11 as possible outliers. As always, we did stress testing on those five cameras and reviewed their history. All had no complaints. All seemed completely fine on testing and even with a peek at the sensor assembly. Interestingly, four of these five had had a sensor replacement at factory service in the past, and the other a shutter replacement. (To give perspective, 5 of the other 133 cameras had also had sensor replacements and measured fine.)
My conclusion is that the factory service center isn’t quite as good at spacing sensors as the assembly line is, but it doesn’t seem to matter at all.
What Did We Learn Today?
Most importantly, we learned Roger’s Formula of Internet Wrongness and that it should be called a Roger because Roger makes so many wrong assumptions. (I will give myself credit for being one of about three people on the internet willing to say, ‘wait, I was wrong.’) Also, sometimes measuring things nobody else has measured is interesting.
We also learned that using a tool to measure flange-to-sensor distance is an excellent thing to do for photo cameras, not just video cameras, which was a surprise. We found fixable issues in about 2% of our photo camera fleet. I’m a repair and quality assurance person. To me, that’s a huge thing. Huge enough that we need to get a second Denz tool for the photo techs since the first one is already monopolized by the video people. (I tried getting the photo techs to sing “Oh lord, won’t you buy me a Mercedes Denz” in a flash mob, but that didn’t happen.)
Second, we learned that while slightly over 2% of photo cameras had issues, they still worked well. So that first assumption, that flange-to-sensor distance of photo cameras doesn’t matter, is largely true. We knew it wouldn’t affect infinity focus since photo lenses will all focus past infinity, with few exceptions.
Even broken sensor mounts don’t seem to matter a whole lot. However, there may be some subtle problems. AF microadjustment on an SLR may be increased; there may be some subtle image softness or mild IBIS dysfunction. But you have to look hard, and even then, finding it is iffy.
Testing flange-to-sensor distance let us find that 2-3% of our cameras had something broken that was causing, at most, some mild symptoms. Logic tells me that if these things were not fixed, they would eventually get worse. So, I think this is a sound ‘early detection’ system.
Let’s remember what the sample pool was: hundreds of cameras from a large rental fleet. That’s a two-edged used-camera sword. These are heavily used rental cameras, probably more heavily used than your own camera, more likely to be damaged, and the sensors are certainly cleaned more often.
But, they are also regularly tested and screened by experienced people with a lot of equipment you don’t have lying around the house. So, significant problems would have been weeded out before we did this test. If AF adjustment was at absolute maximum, for example, the camera would already have been repaired. Same if the IBIS had a major dysfunction or if the images looked bad. So it’s possible that this does cause apparent problems sometimes, but those cameras had already been fixed.
What does it mean for you and your personal camera? Well, I’d suggest if you dropped a camera with a lens on it, you consider there might be subtle damage after the initial ‘oh, thank goodness it’s OK’ moment passes. If you have AF microadjustment, you might check to see if it has changed. If it has changed, you might want to send it in for servicing. (I should mention that most SLR cameras benefit from a bit of AF microadjustment; that’s normal. A camera requiring lots of AF microadjustment on most of your lenses might indicate a problem.)
That 2% of A7-series cameras had fractured sensor-to-IBIS mounts is a bit scarier, I know. This seems to be a full-frame-only issue, which makes sense; we saw no instances of IBIS fractures on m4/3 or crop-sensor cameras. That doesn’t mean it never happens, but it probably happens much less frequently. The IBIS units for full-frame cameras are much bigger than those for crop cameras. Bigger means more moving mass, and, just like dropping more mass, more moving mass means more force.
We have noted over the years that manufacturers are making IBIS units larger and sturdier. Looking at where most of these fractures occurred, I would guess maybe the sensor-to-IBIS mounts, which haven’t changed much, might need to get a bit sturdier on the next generation.
But that’s just my guess and we see in this article how frequently my educated guesses are totally wrong. It’s possible it’s just a rental camera thing; that heavier use and more frequent cleaning cause the issue.
Whatever the cause, I wouldn’t avoid getting an IBIS camera because someday this might occur. IBIS is awesome, and if it’s going to fail someday, well, it’s still awesome until that day. Someday the shutter is going to fail, too, and we don’t avoid cameras with shutters. Eventually, cameras and lenses fail; it’s what they do. Finding out they’re going to fail before they totally fail, though, that’s worthwhile.
Addendum: It’s not often I find myself outgeeked, but I have. This article is an incredibly good (it’s long, but worth it) read on sensor tilt. I recommend it.
Roger Cicala, Aaron Closz, and Ben Berggren
Thanks to Will Glynn for technical assistance and Erick Marquez for repair department photographs.
I’m just putting this down here so it will show up above the comments.
Author: Roger Cicala
I’m Roger and I am the founder of Lensrentals.com. Hailed as one of the optic nerds here, I enjoy shooting collimated light through 30X microscope objectives in my spare time. When I do take real pictures, I like using something different: medium format, or a Pentax K1, or a Sony RX1R.