Ten years ago, if you wanted to try out some photography equipment and you lived in a large market, your local camera store would have a few beat-up copies of popular lenses for rent (with a 100% deposit). The rest of us didn’t even have that option. I had this great idea to start an online rental offering, no deposits necessary, and shared it with people I knew. Almost everyone said I would get robbed blind and lose every dime I had. Almost everyone said you’d get junky, beat-up rental equipment and were wasting your money renting online. Almost everyone said my idea would be a massive failure.
I say ‘almost everyone’ because a few other people thought it was a good idea, too. You guys, our customers, thought it was a good idea. We’d never met each other unless emails count as a meeting. But we trusted each other because we all wanted this to work. Because we few thought that getting to use equipment for a few days or weeks at a reasonable price just made sense.
Almost everyone turned out to be dead wrong and we few turned out to be right. Lensrentals thrived beyond anyone’s wildest dreams. Sure, I took risks, and the people who joined me here worked their butts off. But you guys, our customers, were our partners in proving ‘almost everyone’ wrong. Without you, it never would have happened.
Ten years later, saying thank you just isn’t adequate. There are no words that would possibly express my gratitude for all of you who supported Lensrentals and created our success; you folks who shared in proving ‘almost everyone’ wrong.
There are no words, but I believe actions are more important than words. Everyone who works here tries to show our gratitude in our actions. Whether it’s making all of our testing data public, making sure your rental arrives in better condition than you expected, drawing a dinosaur on your shipping box because you requested it, or just talking you through a difficult set-up on the phone, we want to show you our gratitude with every rental. We want you to know it’s more than just business. It’s a partnership between you and us. You’ve helped us achieve our goals; we want to make certain we help you achieve yours.
We wouldn’t be doing what we love every day without you. We want our actions, our attitude, and our service to let you know, every time you rent from us, that we are grateful you have partnered with us along this journey.
It’s been known for some time that wide angle M mount lenses have issues on Sony mirrorless cameras because of the way the sensors are designed, but there’s never been a very good list of what lenses work best, much less a visual reference to see how bad things can really be. So I took it upon myself to fix that. This isn’t a comprehensive list, but it’s at least all the M mount lenses we stock that are shorter than 35mm, including Leica, Voigtländer, and Zeiss lenses in our inventory.

Below you’ll find images I took with a Leica M Typ 240 body and a Sony a7R II body with a Voigtländer M mount to E mount adapter. To keep things compact, I’ve cut the corners and centers out of the test chart images to show the impact the Sony sensor has on corner sharpness and/or color shift, compared to how the lens is meant to perform on Leica bodies. In each sample, you’ll see the Leica image on the left and the Sony on the right, with wide-open performance on top and stopped down to f/8 on the bottom. Because of the higher resolution of the Sony a7R II (7952×5304), I resized the original images to match the M Typ 240 (5952×3968) before cropping, for convenience in showing side-by-side images. The effects of the sensor stack are still plainly visible even at the slightly reduced resolution.
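For those curious about the mechanics, the resize-and-crop step described above can be sketched in a few lines of Python. This is purely my own illustration of the geometry; the 400-pixel patch size and the Pillow-style box convention are assumptions, not the actual script used for these tests:

```python
# Hypothetical sketch of the comparison workflow: the a7R II frame
# (7952x5304) is scaled down to match the M Typ 240 (5952x3968), then
# corner and center patches are cut for side-by-side display.

def resize_scale(src_w, dst_w):
    """Scale factor to map the Sony frame onto the Leica dimensions.
    Both sensors are (nearly) 3:2, so one axis is enough."""
    return dst_w / src_w  # 5952 / 7952, roughly 0.7485

def crop_boxes(w, h, patch=400):
    """Corner and center crop boxes as (left, upper, right, lower)
    tuples, the convention Pillow's Image.crop() uses."""
    cx, cy = w // 2, h // 2
    return {
        "top_left": (0, 0, patch, patch),
        "top_right": (w - patch, 0, w, patch),
        "center": (cx - patch // 2, cy - patch // 2,
                   cx + patch // 2, cy + patch // 2),
        "bottom_left": (0, h - patch, patch, h),
        "bottom_right": (w - patch, h - patch, w, h),
    }

boxes = crop_boxes(5952, 3968)
```

With Pillow, each box would simply be passed to `img.crop(box)` after resizing the Sony frame to (5952, 3968).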
To draw some quick conclusions from the above images: every lens we stock is affected to some degree, but there are at least a few I would consider still very usable wide open. The Voigtländer 15mm f/4.5 Heliar III, Leica 16-18-21mm f/4 Tri-Elmar, Leica 21mm f/1.4 Summilux-M ASPH, Leica 28mm f/1.4 Summilux-M ASPH, and Leica 28mm f/2.8 ASPH Elmarit-M II are all what I would consider good enough at maximum aperture, with the Tri-Elmar being the absolute best of the bunch and the Summiluxes very close behind. Almost all of the tested lenses improve in the corners when stopped down to about f/8, which is where a lot of these lenses really shine for landscape work anyway. One curious thing to note, though, is that when stopping down, you can often improve corner performance by adjusting your focus so that the center isn’t at its sharpest but is still within the depth of field. Check out this example from the Zeiss 15mm f/2.8 at f/8:
On the left are the corner and center when I focused on the center; on the right, the corner and center when I focused on the corner. There’s negligible difference in the center, but a world of difference in the corners. Keep in mind that this is at chart distances of only three or four feet, so at infinity the results may not be as positive. That holds for all of these lenses: at or near infinity, corner performance on Sony bodies will diminish somewhat as the rear elements of these lenses move closer to the sensor. The Tri-Elmar and the 21mm and 28mm Summiluxes are still top performers even at infinity, though, and they’re my regular go-to lenses for wide work on Sony bodies.
So is it worth it to even try adapting these lenses? Well, with the still limited lens options available for Sony a7 bodies these days, I would say…maybe. New E-mount options seem to be coming out all the time, like the Voigtländer 10mm f/5.6 Hyper-Wide Heliar and 15mm f/4.5 Super-Wide Heliar, as well as the Zeiss Batis 18mm f/2.8 and 25mm f/2. These are great options, and much more reasonably priced compared to offerings from Leica. If you already have an M mount system and are looking to try Sony, this list I’ve made is meant for you. If you’re a Sony owner looking to find some good wide angle alternatives, now you have a visual reference for your best options. And let’s be honest, nothing looks as good as that 21mm Summilux, and for $7400, nothing else really should.
Part of my job here at Lensrentals is reviewing and building large video-centric orders before they’re shipped. To understand why that matters, it helps to first know a little about how our equipment inspection process works. When an order comes back from a rental customer, each individual item is placed on a shelf with other like items. All of our Canon 5Ds, for instance, are stored and inspected together; the same goes for all of our lenses, cables, lights, etc. Everything is checked on an individual basis to ensure that it’s performing up to our expectations. Once these items are inspected, they’re moved to a shelf location in our warehouse, where they wait to be rented, cleaned, and shipped.
This is the most efficient way to test all of our equipment quickly and thoroughly, but it does leave open the possibility for compatibility issues to be present in orders. A monitor, for instance, might work just fine under our testing conditions but, for whatever reason, end up not working well with a specific camera or cable. It’s also possible for orders, especially the large video ones that I work with, to be placed without an essential item like recording media. That’s where my spot in our assembly line becomes important. Orders with certain camera bodies, an especially high number of items, or pieces of equipment that need hands-on setup before they can ship come to me. I build everything in the order as if I were going to shoot with it, and then call the customer if there are any compatibility problems or possible missing items. I’ve been doing this for a while now, and I’ve noticed a pattern in the kinds of problems I tend to catch, so here are 6 things I think you should consider before placing your rental.
Media and Readers
Since so many of our customers prefer to purchase their own recording media, it’s not included with any of our cameras, in order to keep rental costs down. So it’s pretty easy, especially if you’re a first-time customer, to put together an order, forget to include a CF card or SSD, and then find yourself with a very expensive camera that you can’t actually record any video with. The same goes for card readers. We have at least one type of reader available for every card we carry, but in most cases they’re not included with camera rentals because we don’t want to build in the cost for people who have their own. While many of our video cameras include USB cables and can transfer footage to a computer, I’d always recommend a standalone reader because they’re generally faster and more reliable.
Batteries and Chargers
With very few exceptions, every device we carry that uses a battery will include one battery and one charger. If you’re unsure whether the item you’re renting includes a battery, check under the “Includes” tab of the product page. Under the “Specifications” tab, we also list typical battery life for each product. While this can vary based on usage factors, it’s generally a good tool to estimate how many batteries you’ll need for your shoot. My advice is to play it safe and always order more batteries than you think you’ll need. They’re generally pretty affordable, and it’s always better to have them and not need them than to need them and not have them. A significant exception to the “one battery, one charger” rule of thumb is anything that takes disposable batteries: AA, AAA, 9-volt, basically anything you would throw away when it’s dead. Make sure you have some of your own on hand, and if you’re unsure whether a certain item will require disposable batteries, feel free to contact us.
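As a rough illustration of the math (my own rule of thumb, not an official Lensrentals calculator), you can turn the listed battery life into a battery count and pad it with a spare:

```python
import math

def batteries_needed(shoot_hours, battery_life_hours, spares=1):
    """Round up the shoot time divided by the typical battery life
    (listed under the product's "Specifications" tab), plus a spare."""
    return math.ceil(shoot_hours / battery_life_hours) + spares

# e.g. a 10-hour shoot day on a camera rated at ~2 hours per battery
print(batteries_needed(10, 2))  # -> 6
```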
Cables
If you feel like you’re sensing a pattern here, you’re right. It’s easy to put an order together and have small but essential accessories slip your mind. Cables are no exception. None of our monitors or recorders include HDMI or SDI cables, partly because it would be an unnecessary cost for people who already own them, but also because we have no way of knowing what kind of cables you need. We stock a ton of different sizes, lengths, and brands of HDMI and SDI cables for various uses. If you have questions about what type of cable will work best with your camera, or what cable length you’ll need for your support setup, just give us a call and we’ll be happy to help.
Ergonomics
Now we’re branching out a bit from things people typically forget to the inherent limitations of renting equipment without having used or set it up yourself. If you’re, say, renting a shoulder mount kit and have concerns about ergonomics, let us know and we’ll do everything we can to give you a better idea of whether or not a certain product will work for you. We do our best to take informative and accurate product pictures, but there’s only so much you can tell about a product from a couple of still photos, especially if you’re combining multiple accessories. Not sure whether a shoulder mount will allow you to see the camera monitor? Whether the included rods are long enough to mount a matte box? Whether or not you’ll need a VCT plate? Let us know, and we’ll gladly set everything up and answer any questions you have.
Gimbal Weight Distribution
I know this sounds like a pretty specific compatibility issue for a list of the most frequent problems we catch, but it’s surprisingly common. We end up calling at least one customer per day about weight distribution on handheld gimbals. The DJI Ronin specs, for example, list a weight capacity of 15 pounds, so people who haven’t worked with that product before tend to think that anything weighing 15 pounds or less will fit. The problem is, depending on your lens choices, accessories, camera body length, and battery weight, your setup may be so front heavy that you can’t move the camera body back far enough in the camera cage to compensate. We carry the extended-arm DJI Ronin specifically to combat this issue. If you’re not sure whether your setup will work with the gimbal on your order, just give us a call, or request a check in the “Special Instructions” section of your rental form.
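A back-of-the-envelope way to see why total weight isn’t the whole story: compute the rig’s center of gravity along the lens axis and compare it to the cage’s fore-aft travel. The numbers below are made up for illustration; they are not DJI specifications:

```python
def center_of_gravity(parts):
    """parts: list of (weight_lb, position_in) tuples, with positions
    measured from the camera's mounting point, positive toward the lens.
    Returns (cg_position, total_weight)."""
    total = sum(w for w, _ in parts)
    cg = sum(w * p for w, p in parts) / total
    return cg, total

rig = [
    (2.0, 0.0),   # camera body at the mounting point
    (3.5, 3.0),   # heavy zoom lens, CG about 3 inches forward
    (0.5, -1.0),  # battery hanging off the back
]

cg, weight = center_of_gravity(rig)
# weight is 6.0 lb, well under a 15 lb capacity, but the CG sits about
# 1.67 in forward of the mount; if the cage can only slide the camera
# back, say, 1 in, the rig can't be balanced despite being "light enough".
```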
Complicated Setups
This last one is pretty general, I guess, but it’s probably the most important. One of the advantages of renting equipment rather than purchasing it is that it can broaden your horizons. You have the ability to try things you couldn’t with just a camera and a couple of lenses. The disadvantage, of course, is that every piece of equipment you add to your setup increases the chances of something going wrong or being forgotten. We include as much information as possible on our product pages, but if you’re trying something you’ve never done before, there’s really no way to know it’ll work without setting it up. People call us all the time with dreams of building Rube Goldberg-esque contraptions of switchers, HDMI to SDI adapters, recorders, wireless transmitters, batteries, extension cables, and streaming boxes, all of which can either work flawlessly or halt a whole production depending on what else they’re working with. We’re always happy (honestly, it’s fun) to set up whatever we can in the office and let you know whether or not it works.
The common thread through all of these recommendations is to contact us if you have any concerns at all about whether you’re missing any important equipment or whether the equipment you do have will perform the way you need it to when it arrives. You can call, email, or add notes to the “Special Instructions” box when placing your order. Either way, a video or photo tech will gladly answer your questions or just review your order for missing items.
We’ve put all of this information out in other articles, but it came out piecemeal as we finished each test, so it’s become scattered about. I wanted to simply put all of our MTF and variance curves for Sony FE lenses in one place so you wouldn’t have to hunt through different articles to get the information you wanted. So there’s nothing new here, and very little commentary; just a single place where you can go to compare FE lenses.
Remember that each lens was tested at its widest aperture. The Sony FE 35mm f/1.4 has lower MTF than the Sony FE 35mm f/2.8, for example, because they were tested at different apertures, not because the f/2.8 is a sharper lens.
I also want to point out that the new variation curves we’re using now tend to minimize the difference between lenses a bit compared to the old ones. We think that’s appropriate since people were really splitting hairs with them before. But with these, when you notice a difference, it’s a pretty real difference.
But your other question, why it has taken so long, has a more complex answer. Testing FE lenses is much more difficult than testing standard photo lenses. The electronic focusing mechanisms required us to make some major modifications to our testing bench. The baffles in FE mounts clip some data at the edges, making the testing process more difficult. FE lenses, having a short backfocus distance, are more sensitive to the amount of optical glass in the imaging pathway. And, most recently, distortion curves and copy variation among some of these lenses made us question our results and testing methods, sending us back to redo a lot of data and taking up a couple of months of our time.
The end of all that is in sight, and we found, after a lot of internal retesting and questioning, that the original results we got for these two lenses several months ago are valid, so we’re publishing them now. During this process, we’ve also changed our variation graphs somewhat, so they’re going to look a little different this time. That’s simply because these are new and better. We’ll be changing all of our other lenses over to the new graphs eventually. (Don’t worry, I’ll have a follow-up article next week putting the results of all the FE lenses in one place using the same graphs.)
Finally, no, we still aren’t quite done determining a ‘variation’ or ‘consistency’ number formula that we’re completely happy with, so there won’t be numbers for a while longer. You’ll also notice the FE variation graphs are different than what we’re using for other lenses. So first, let’s get a good look at what the two new zooms look like on the optical bench.
Remember that when we test FE lenses, we get some clipped data because of the baffle in the lens. So while the data from the center to 14mm is an average of 8 measurements for each lens, the data at 18mm and 20mm is generally 4 measurements per lens. This sometimes makes the curves look a bit odd at the edge, and we encourage you not to get overly analytical about the 20mm data.
We have some nice comparison lenses to use here: the Canon 16-35mm f/4 IS and the Nikon 16-35mm f/4 VR. Below are the MTF curves for each at 16mm, 24mm, and 35mm. As usual, Zach has done his magic so you can enlarge the graphs to see them better.
At 16mm, all three lenses are good in the center, with the Sony being better than the other two at higher frequencies. The Nikon develops ‘astigmatism-like’ patterns fairly close to the center, while the Canon and Sony hold together well until about halfway to the edge of the image. In the corners, the Sony, while quite astigmatic, maintains sharpness quite well.
At 24mm things repeat somewhat, although the Nikon does better at this focal length and is in some ways better than the Canon. There is an odd ‘bounce-back’ in the Canon MTF at 20mm. It looks like an artifact, but it’s repeatable and consistent copy-to-copy, so I don’t know whether we can accept those 20mm (edge) readings at face value; the MTF graph may be a bit better than the real world. But again, the Sony appears to be slightly the best of the three.
At 35mm, the Canon is at its best. The Nikon and Sony aren’t quite as good here, although I want to be clear: these are all really good performances for wide zoom lenses.
Overall we’d have to say the Sony has the best MTF at 16mm, particularly at higher frequencies, while the Canon is a bit better at 35mm. The Nikon is at its best at 16mm but doesn’t hold up quite as well at longer focal lengths. The Sony, as all Sony shooters have come to expect, is more expensive than the other two, although not hugely so. In this case, though, it’s at least as good as, and in some areas better than, the others. It’s a really good lens.
I like to tell you my expectations pre-testing. In this case: wide-angle zooms tend to vary a lot, and Sony FE lenses tend to vary a lot, so I figured the Sony FE 16-35 f/4 would, duh, have a lot of copy-to-copy variation.
Before we get to the graphs, let me repeat that these are different from what we’ve used in the past. In the old graphs, we doubled the Y axis so we could show all 5 frequencies we measured. Now we’re keeping the Y axis from 0 to 1, just like the MTF graph, and showing you three frequencies: 10, 30, and 50 line pairs/mm. We think this gives a more intuitive picture of what the MTF of a given copy should look like. The lines in the center are MTF lines and should look the same as the MTF graphs, while the colored area is the range we expect typical copies to fall in.
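To make that shaded region concrete, here is a minimal sketch of how a spread band can be built from per-copy measurements. The 1.5-standard-deviation band width is an arbitrary stand-in for illustration, not Lensrentals’ actual (unpublished) variance formula:

```python
def variance_band(copies, k=1.5):
    """copies: one list of MTF readings per copy, each sampled at the
    same image-height positions. Returns (mean, lower, upper) lists:
    the mean line plus a band of k population standard deviations."""
    n = len(copies)
    means, lower, upper = [], [], []
    for point in zip(*copies):          # all copies at one position
        m = sum(point) / n
        sd = (sum((v - m) ** 2 for v in point) / n) ** 0.5
        means.append(m)
        lower.append(m - k * sd)
        upper.append(m + k * sd)
    return means, lower, upper
```

Plotted with something like matplotlib’s `fill_between`, the mean becomes the center line and `lower`/`upper` bound the colored area.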
Again, I want to emphasize we’ve tweaked the formula for our variation curves a bit so you’ll find some slight differences between the variation curves for the Canon and Nikon 16-35 f/4 lenses and our previously published curves. The lenses are the same, our math and graphing are a little different. (Again, you can learn how to read MTF charts here as well as Variance Charts here)
We know from previous testing that variation for the Canon and Nikon 16-35 f/4s is pretty good as zoom lenses go; not great, but good. The Sony 16-35 f/4 OSS ranges from very similar to a bit less variation at the wide end, and from very similar to a bit more variation at the long end. Overall, both from an MTF and a copy-to-copy variation standpoint, the Sony FE lens is as good as, and sometimes better than, the Canon and Nikon offerings.
The 24-70 f/4 has been a very difficult lens for us to test, for several reasons. One of the major ones was distortion. The distortion figures we got testing it on the optical bench were quite different from what is reported for it. I don’t know exactly why, and I don’t intend to speculate here. Most importantly, though, there was some copy-to-copy variation in distortion, which is rather unusual. We finally ended up running a distortion curve for each copy at each focal length, testing the MTF, then repeating. It made the testing really lengthy and drawn out.
I mention it to be clear that testing this lens irritated me and I am therefore predisposed not to like it. I also point out that because of all this I am a bit uncertain with our testing methods with this lens. We’ve repeated measurements numerous times and I’m comfortable that the results are reproducible. But whenever something is weird and I don’t quite understand why, I’m always hesitant about publishing the results. But people keep asking, so here you go. Just keep in mind that there’s something about this lens I don’t understand so we may be missing something.
The logical comparison for the Sony 24-70 f/4 is the Canon 24-70 f/4 IS L lens. Unfortunately, I only have new-method results for the Canon at 24mm and 70mm, so you’ll have to cope with not having a 50mm comparison. But here are the MTF curves for the Sony at three focal lengths, compared to the Canon at two.
As you can see, the Canon is clearly better at the 70mm end, and better at the 24mm end. The Sony, however, is at its best at 50mm, the center of its zoom range, especially off-axis, where it maintains sharpness with very little astigmatism all the way to the edge of the field.
We’ll start with the variance curves for the Sony FE 24-70mm f/4 at the three focal lengths we tested. It really is pretty good at 24mm and 50mm, but there is a bit of worrisome center variation at 70mm. (Center variation is associated with an overall sharpness difference, while off-axis variance often indicates more of a tilt.) For a zoom, though, this isn’t a bad performance, as we’ll see below.
The Canon 24-70mm f/4 IS again only has curves at 24mm and 70mm. As you can see in the graphs below, off-axis it has more variance than the Sony, but doesn’t exhibit the center variance at 70mm that the Sony did.
So the bottom line is that while there is definitely some copy-to-copy variation among the Sony f/4 zooms, it doesn’t appear any worse than in most zoom lenses.
I will say I’m pleasantly surprised. The Sony FE 24-70mm f/4, while not a great lens, is adequate (note I didn’t say adequate for the price; I said adequate), and its copy-to-copy variation isn’t bad. The Sony FE 16-35mm f/4 actually is excellent for an ultra-wide zoom, and again, it seems to have decent sample variation. Would I buy them? Probably not the 24-70 f/4 unless I had no other options, but I wouldn’t hesitate on the 16-35mm f/4.
I realize we’ve put out our Sony FE test results in bits and pieces, working through a lot of problems and figuring things out. That’s what this blog is about: trying to figure things out and showing them to you as we go along. But I also realize a lot of you want that information without having to Google 6 different articles to find the test results you’re looking for. So in the next week or so I’ll put out a summary article just showing all the FE MTF and variation charts in one place.