We all know that aberrations affect points of light off-center, making them blurred. We all know that some aberrations are worse the farther away from center we go. And we know that some aberrations are improved when we reduce the aperture.
Some of us even know the various rules-of-thumb for what makes an aberration worse or better. Astigmatism and field curvature get worse with the square of the distance from the center of the lens, and distortion with the cube. Coma and lateral chromatic aberration also get worse away from center, but only in proportion to the distance, and spherical aberration isn't worsened at all as you move away from center. Reducing the aperture dramatically improves spherical aberration and coma, reduces field curvature and astigmatism to a lesser degree, but doesn't have much effect at all on distortion or lateral chromatic aberration.
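Those rules-of-thumb come from classical third-order (Seidel) aberration theory. As a rough sketch only, the textbook scaling exponents can be tabulated and used to estimate how much each blur contribution shrinks when you stop down; the function and numbers below are illustrative, not measurements from any of the lenses in this article, and real lenses mix in higher-order terms too.

```python
# Textbook third-order (Seidel) scaling: transverse blur contribution
# ~ (aperture radius)^a * (field height)^h. Illustrative only.
SEIDEL_EXPONENTS = {            # (aperture power, field power)
    "spherical":       (3, 0),  # improves fast stopped down; flat across field
    "coma":            (2, 1),  # improves well; grows linearly off-axis
    "astigmatism":     (1, 2),  # improves some; grows with square of field height
    "field curvature": (1, 2),
    "distortion":      (0, 3),  # unaffected by aperture; grows with cube of field
}

def relative_blur(aberration, stops_down, field_fraction):
    """Blur relative to wide open at the field edge (both normalized to 1).

    stops_down: whole stops closed from maximum aperture (one stop halves the
    area, so the aperture radius shrinks by sqrt(2) per stop).
    field_fraction: 0 = image center, 1 = corner.
    """
    a_pow, h_pow = SEIDEL_EXPONENTS[aberration]
    aperture = (2 ** -0.5) ** stops_down
    return aperture ** a_pow * field_fraction ** h_pow

# Stopping down 2 stops (e.g. f/1.4 -> f/2.8), measured at the corner:
for ab in SEIDEL_EXPONENTS:
    print(f"{ab:15s} {relative_blur(ab, 2, 1.0):.3f}")
```

Two stops down, this toy model says spherical aberration drops to an eighth, coma to a quarter, astigmatism to half, and distortion not at all — which matches the rules-of-thumb above.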
It gets pretty complex, doesn’t it? And since different lenses have very different amounts of the various aberrations, none of us really have any idea exactly how much improvement to expect when we stop down a bit. We do some trial and error (well, most of us do) and decide where the “sweet spot” for a given lens is. I know to shoot my Zeiss 50mm f/1.4 at f/5.6 if I want sharp images away from the center, for example, just because I’ve played with it and figured that out. On the other hand, I can shoot my Sigma 35mm at f/1.4 — it doesn’t really seem to get much sharper at f/2.8.
Since we’ve been using OLAF to look at how lenses render points of light off-axis, I thought it might make a fun demonstration to see how moving across the field of view affects how the lens “sees” a point of light, and how stopping down improves it.
The Sum of Aberrations
I chose 35mm lenses for this demonstration, mostly because I shoot at that focal length a lot, so I was interested in comparing those lenses. For those of you who haven’t followed the Geeky articles about OLAF, here’s a quick summary of what you’ll be looking at. We’re shining a 5 micron pinhole of light through the lens, then using a camera and collimator to see what that pinhole would look like to our camera. For any decent lens, if you do that right in the center, you see the image of a pinhole with some halo of light (basically, the lens’s point spread function) surrounding it. Not too surprising so far.
But as we start moving the camera off-center, the various aberrations of the lens become more evident, and our point starts looking less and less like a point. The 35mm lenses I'm playing with today have a field of view (the angle from one edge to the other) of about 62 degrees, so I can "see" roughly 30 degrees to either side of the center. The best-resolving 35mm lens is the Sigma 35mm f/1.4 Art, so we'll use that for an example. Here's what that point of light looks like as we move from center (above) through 12 degrees to 30 degrees off-center (below).
Remember, OLAF is designed to make aberrations look bad — its purpose is to adjust bad lenses, so the more it shows us the better we can do that. Also, modern lenses are designed expecting that light will pass through cover glass, an IR filter, a Bayer array, and the sensor’s microlenses; all of which refract the light further, so the camera probably doesn’t “see” quite this degree of aberration. But this lens is as good as you can get at 35mm focal length and f/1.4 aperture.
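As a quick sanity check on that 62-degree figure, the diagonal field of view of a lens follows from simple trigonometry. The sketch below assumes a full-frame sensor (43.3mm diagonal, 21.65mm half-diagonal); the exact number depends on how "edge to edge" is measured, so treat it as an approximation, not the article's definition.

```python
import math

def diagonal_fov_degrees(focal_length_mm, half_diagonal_mm=21.65):
    """Diagonal field of view for a rectilinear lens on a full-frame sensor.

    Assumes a simple thin-lens model focused at infinity; half_diagonal_mm
    is half the 43.3mm full-frame sensor diagonal.
    """
    return math.degrees(2 * math.atan(half_diagonal_mm / focal_length_mm))

fov = diagonal_fov_degrees(35)
print(f"{fov:.1f} degrees")  # roughly 63 degrees, i.e. ~31 to either side
```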
What Happens When We Stop Down?
For aberrations, at least, good things should happen off-axis when we stop down. As mentioned above, spherical aberration and coma should improve a lot, and astigmatism should improve some. (Field curvature doesn't count in this demonstration because I've focused on the off-axis points, not the center point.) I mentioned earlier, too, that different lenses have different aberrations contributing different amounts, so let's put up a comparison of several 35mm lenses. I'll just show the 30-degree points for each, since that's the farthest from center, where all of the lenses are at their worst.
A couple of points about the images above. First, in order to save some space and make the illustration nice and square, I started each lens at its widest aperture and then moved to f/2.8, so the f/2 lenses have a bit of theoretical advantage on the widest-aperture line. Second, OLAF's cameras were set on automatic and made some brightness adjustments on their own, so you can't judge relative brightness (or amount of vignetting) from the above images.
This isn't a test of "best-lens-at-the-edge-of-the-image," although I do think we can say it's a reasonable test of the worst lens at the edge of the image. There are no real surprises here; the Canon 35mm f/2 is a cheap, 20+ year-old design, and we knew the other three were all great lenses. If there's a surprise, I think it's how good the Canon 35mm f/2 IS points look off-axis.
I do think it’s a nice illustration, though, of the way differently designed lenses have slightly different effects as you stop down. It’s also interesting that the designers of the different lenses have chosen slightly different aberrations (aberration-free is not possible). This gives slightly different patterns to the off-axis points, and closing the aperture down has slightly different effects.
You Want to See a Zoom?
Zoom lenses are way, way more complex than primes. With most of the zooms, the off-axis points have much more complex aberrations than these. But the Canon 24-70 f/2.8 Mk II is widely acclaimed to be "nearly as good as a prime" by lots of people (including me). So I thought, since it happens to work at 35mm, we might repeat the test with it to compare to the 35mm primes above. So here is the Canon 24-70 f/2.8 Mk II at 35mm, also 30 degrees off-axis, at various apertures.
The f/2.8 image gives you an idea of the increased complexity of aberrations with a zoom lens, but the bottom line is that even at f/2.8, at the very edge of the field of view, it still makes a reasonably sharp point. Stopped down it gets even clearer. Looking at just this one point, I have to say the Canon zoom at f/2.8 is pretty impressive. If you don't believe me, below are 4 other zoom lenses shot at 35mm and f/2.8.
It’s Nothing You Don’t Already Know.
At least it shouldn't be. But sometimes seeing the image is a lot better than interpreting all the geeky numbers we usually generate. You number geeks don't need to worry, though; it's only a matter of time before we get software to generate MTF numbers from these point spread images.
Roger Cicala and Aaron Closz