Once upon a time I wrote an article attempting to explain phase-detection autofocus and its limitations. That was a lot like trying to explain what happened to the Mary Celeste or why Stonehenge was built.
Actually we probably have a better handle on the Stonehenge thing than we do on the Mary Celeste or phase-detection autofocus. So rather than rehashing and speculating, I thought I’d instead do some simple demonstrations and explorations.
Obviously there’s a lot of ground to cover, so rather than making a 162-page blog post, I thought I’d break this up a bit. This first part will be very simple. We’ll compare the accuracy of phase-detection AF, contrast-detection AF, and Roger-detection MF (that would be me manually focusing using 10X LiveView) on still targets using just center point AF. While we’re here, we’ll demonstrate just what AF Microadjustment does, and doesn’t, accomplish.
We use Imatest to demonstrate AF variation, using Canon full-frame cameras with 50mm f/1.4 and f/1.2L lenses. The reasons are pretty straightforward: We shoot a still test target, with star, vertical, and horizontal focusing aids in the center, from a tripod at a 12-foot distance, so the depth of field is narrow and we should be getting very accurate AF. If focus is off by just a bit, Imatest will detect the decrease in resolution.
Phase-Detection AF Accuracy with an Older Design
We started with a Canon 50mm f/1.4 lens mounted to a 5D Mk II camera. The first thing we did (deliberately a bit out of focus so it would show up easily) was set the lens on manual focus and take eight repeated images without touching anything. This gave us a baseline (represented by blue diamonds) of the variation in testing procedures if nothing else changed.
Then we took eight repeated shots using LiveView and manual focus (represented by red squares), spinning the focus ring between each shot to either infinity or the closest focusing distance. The red squares demonstrate Roger Units (how well Roger can manually focus given all the time in the world, a tripod, and a perfect test target).
Finally we did the same thing but let the camera autofocus in LiveView (represented by green triangles), resulting in contrast-detection autofocus. In theory this should be as accurate as Roger is, perhaps more so.
The graph below shows the results of those shots. For those of you who are not familiar with our Imatest graphs, the numbers reflect the sharpness of the image in Line Pairs / Image Height. The sharpness in the center is shown on the X-axis and average sharpness on the Y. Higher is better and, in this test, better focus equals higher sharpness.
The graph shows that there’s a little bit of variation when we take several shots without changing anything (blue diamonds), which probably reflects tiny movements from mirror slap, minute changes in lighting, or possibly even tiny fluctuations in sensor performance. Remember the blue shots are purposefully a bit out of focus so they’d show up separately in the graph.
When we tried to manually focus accurately (red squares) or let the camera do it with contrast detection (green triangles), there is a bit more shot-to-shot variation. But the variation is still small, as is the difference between the camera and Roger. To put some numbers to it, the standard deviation of the baseline shots was 5.5; of the Roger-focus shots, 11.5; and of the LiveView AF shots, 8.4.
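For anyone who wants to reproduce this kind of bookkeeping, the variation numbers are just sample standard deviations of the Imatest sharpness readings for each set. A minimal sketch in Python, using made-up LP/IH readings rather than our actual data:

```python
# Compare shot-to-shot variation: take the Imatest center-sharpness
# readings (LP/IH) for each set of repeated shots and compute the
# sample standard deviation. These readings are hypothetical
# placeholders, not our actual test data.
from statistics import mean, stdev

baseline    = [742, 748, 745, 751, 740, 746, 749, 744]  # nothing touched
roger_mf    = [810, 795, 820, 788, 805, 812, 798, 815]  # 10X LiveView MF
liveview_af = [812, 804, 818, 809, 800, 815, 807, 811]  # contrast-detection AF

for name, shots in [("baseline", baseline),
                    ("Roger MF", roger_mf),
                    ("LiveView AF", liveview_af)]:
    print(f"{name:12s} mean={mean(shots):6.1f}  stdev={stdev(shots):5.1f}")
```

With these made-up numbers the baseline set shows the least variation and the manual-focus set the most, mirroring the pattern in the real data above.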
Now let’s throw away the repeated shot results from above since we’ve made that point, and replace them with standard (phase-detection) autofocus shots. These are taken in exactly the same way as the live view AF shots: take the image, spin the focus ring to one extreme, let the camera refocus, save the image.
Two things become obvious:
1) When phase detection AF gets it right, it’s every bit as in focus as contrast detection or Roger detection.
2) Four of these 10 shots aren’t quite as accurate as the other six. That sounds like a ridiculously high miss rate but let’s put the numbers in a bit of perspective.
If we believe in Subjective Quality Factor (SQF), an SQF difference of about five is needed to see a difference in a reasonable print or when pixel peeping at 50 percent. By that standard, the two lowest blue diamonds (phase-AF shots) are going to show up as a bit soft or out of focus if we’re really critical. The third lowest might or might not look different from the “good shots”; it’s right on the edge.
Still, that gives us a “missed focus” rate of 20 percent in this little test, and that’s using center point AF with a still target. That’s not a big sample size, obviously, so we repeated it with several other 50mm f/1.4 lenses and got “missed focus” rates of between 10 and 20 percent for all of them.
That’s not horribly out of focus, but it’s definitely pixel-peeping out of focus. For those of you who like numbers, the standard deviation for phase-detection AF ranged from 25 to 44 on different runs (always greater than the 5 to 15 we got with LiveView AF).
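The “missed focus” counting above boils down to a simple threshold test: flag any shot whose sharpness falls well below the rest of its set. Here’s a sketch of that logic; the margin and the readings are hypothetical, chosen only to illustrate the 2-of-10 pattern, and the margin is in LP/IH rather than SQF units:

```python
# Flag "missed focus" shots: anything more than a chosen margin below
# the median sharpness of the set. Readings and margin are hypothetical.
from statistics import median

def missed_focus(readings, margin=50):
    """Return the readings more than `margin` LP/IH below the set median."""
    med = median(readings)
    return [r for r in readings if med - r > margin]

# Hypothetical phase-detection AF run: 8 good shots, 2 clear misses
phase_af = [805, 798, 812, 700, 808, 795, 690, 810, 801, 799]
misses = missed_focus(phase_af)
print(f"missed {len(misses)} of {len(phase_af)} shots "
      f"({100 * len(misses) // len(phase_af)}%)")  # 20% in this made-up run
```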
Demonstrating Microfocus Adjustment
Next on our trial list, we thought we’d compare with a lens of newer design. The Canon 50mm f/1.4 doesn’t have a real USM motor and is known to be a bit difficult to focus. Not wanting to make the test too easy on the camera (and because we were set up at 50mm), we grabbed a few Canon 50mm f/1.2 lenses. These lenses have a modern design but again are known to be a bit “focus challenged”, to use the politically correct term.
One of the first copies we shot gave us the opportunity to demonstrate microfocus adjustment. We took our usual LiveView autofocus and manual focus sets, followed by a set of phase-detection autofocus shots. When we graphed the results, we had to change the axis range because the phase-detection AF shots were so out of focus. Some of the shots still don’t show up; they’re even worse than the expanded axis shows. Unlike the above example, every single one of these is obviously out of focus at a glance; no pixel peeping necessary.
What happened was readily obvious: This copy of the 50mm f/1.2 was backfocusing badly on this camera. We did a quick microfocus adjustment (12 points) and reshot the lens with the results below.
Here is a superb example of what microfocus adjustment accomplishes. After adjustment, phase-detection AF is now very accurate, although there still is going to be shot-to-shot variation. For the two or three of you who like to scream, “I don’t want to use microfocus adjustment! The lens should be perfect out of the box!”, we went ahead and put the same lens on a different body with no microadjustment.
On camera No. 2 the same lens autofocuses accurately. The other 50mm f/1.2 lenses we tested all autofocused accurately on the first camera. The lens is perfect, just not with the first camera. The first camera is fine with all the other lenses we tested. Sometimes a given lens that’s fine doesn’t match up with a given camera that’s fine. So it goes.
The conclusion is pretty obvious: If you want to shoot wide aperture prime lenses and you don’t want to use microfocus adjustment, you just refuse to cope with reality.
We did several runs with 50mm f/1.2 lenses and found similar results all the way through. Standard deviations for sets using LiveView AF were 11 to 20, while phase detection ran from 20 to 35. These results are similar to what we saw with the 50mm f/1.4.
This really wasn’t a “conclusion” post; our intention was to just demonstrate some autofocus basics that most people know already.
- LiveView (contrast-detection) AF on a still target is more accurate than phase-detection AF. It should be so. Contrast detection is using the actual sensor to determine focus; phase detection is not. Overall we found about one or two shots in 10 were out of focus with phase detection.
- LiveView AF is as accurate as Roger View MF. You may be better than this, or you may not.
- Phase-detection AF has more shot-to-shot variation than contrast detection. It’s not huge, but it’s real. This shouldn’t surprise anyone. Phase detection was developed for fast AF speed and to detect subject movement. It wasn’t developed to be more accurate.
- Microfocus adjustment pulls good phase-detection AF results up to a par with LiveView, but it doesn’t eliminate the small amount of shot-to-shot variation that phase-detection AF has.
I’ll be expanding on this first article in some subsequent posts. Next I will be comparing the “two beep” AF method (where you push the shutter halfway down to focus, then repeat before taking the shot) to the “single beep” method we used here. (I think the “two beep” method is more accurate for no reason other than that when I started, people told me it was so. We shall see.)
We’ll certainly do some comparisons with different lenses on the Canon cameras and try to see if newer designs, STM, or other lens changes reduce the AF variation. Obviously we’ll have to get a larger database to be able to detect subtle changes, but that’s just a matter of repetition. (What we showed today were just examples. We’ve done a larger number of these lenses already.)
We will also look at different camera systems’ AF to see if they seem to have less phase-detection variation. I had thought about comparing contrast-detection systems in other cameras, but it’s obvious we’ll struggle to find one that’s better than Canon’s.
I’m not saying Canon’s is the best, but it’s as good as what I can do, so it may be difficult to detect “better” in this situation. We might be able to detect worse, of course.
Finally, within limits, I’m asking for suggestions about other things we can look at.
Remember, when you make suggestions, that this method requires a fairly narrow depth of field to detect differences. Very long focal lengths (over 200mm full-frame equivalent) test at longer distances, which may increase depth of field too much for us to detect small differences. Small maximum apertures (f/5.6 or so) would have the same effect.
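If you want to sanity-check a suggestion yourself, the standard thin-lens depth-of-field formulas make the point. A quick sketch, assuming the usual 0.030mm full-frame circle of confusion:

```python
# Back-of-the-envelope depth-of-field check for test suggestions.
# Standard thin-lens near/far limit formulas; coc_mm is the circle of
# confusion (0.030 mm is the usual full-frame figure).

def depth_of_field(f_mm, aperture, dist_mm, coc_mm=0.030):
    """Total depth of field in mm for focal length f_mm, f-number
    `aperture`, and subject distance dist_mm (all in mm)."""
    near = dist_mm * f_mm**2 / (f_mm**2 + aperture * coc_mm * (dist_mm - f_mm))
    far  = dist_mm * f_mm**2 / (f_mm**2 - aperture * coc_mm * (dist_mm - f_mm))
    return far - near

# The setup used here: 50mm f/1.4 at 12 feet (~3,660 mm) gives well
# under half a meter of total DoF, so small focus errors show up.
print(f"50mm f/1.4: {depth_of_field(50, 1.4, 3660):.0f} mm")
# The same lens stopped down to f/5.6 has roughly four times the DoF,
# which would hide the small focus differences we're trying to measure.
print(f"50mm f/5.6: {depth_of_field(50, 5.6, 3660):.0f} mm")
```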
Given those limitations, though, I’m open to suggestions!