Lenses and Optics

Sigma 24-70mm f/2.8 DG OS HSM Art Sharpness Tests

Sigma has been on an incredible run these last five years, releasing one amazingly sharp lens after another. They’ve taken on lens designs no one has ever tried before and not only succeeded, they made them amazingly good on the first try. Their quality control has become as good as anyone’s, better than most. And their repair service has become one of the best out there.

Like many of you, we’ve waited for the Sigma 24-70 f/2.8 Art lens for quite a while. It would have image stabilization, it would be less expensive than the brand name alternatives, and it would be sharp as heck, because it was a Sigma Art.

I’ll save those of you who hate to read the trouble of reading. Even Babe Ruth hit singles sometimes. It had to happen. Sigma has made lens after lens that exceeded everyone’s wildest expectations. Sooner or later they were going to make one that didn’t. This isn’t a bad lens, but we’ve come to expect amazing things from Sigma Art lenses and this lens is not amazing.

As always, these are the averaged results of 10 tested copies, each tested at four rotations. For those who don’t speak MTF, the easy version is: higher is better, and dotted and solid lines of the same color close together are better. And as always, this is an MTF test, not a review. I’m still not sure I can pronounce bokeh, much less describe it to you.
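For the curious, the copy-averaging described above can be sketched roughly like this (the numbers are invented for illustration; this is not Olaf’s actual pipeline or data):

```python
# Rough sketch of averaging MTF readings across copies and rotations.
# The values below are invented for illustration; they are not Olaf data.

def average_mtf(readings):
    """Average the MTF readings (0 to 1) taken at one field point,
    one reading per copy-and-rotation measurement."""
    return sum(readings) / len(readings)

# 10 copies x 4 rotations = 40 readings for a single field point
readings = [0.72, 0.74, 0.70, 0.73] * 10
print(round(average_mtf(readings), 4))
```

Higher averaged values mean better contrast transfer at that point in the frame; the charts below plot these averages across the field.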

MTF Results

We’ll look at the results at three focal lengths: 24mm, 50mm, and 70mm. We expect most 24-70mm zooms to perform best at 24mm and be weakest at 70mm. The Sigma is actually a bit different, having its best performance at 50mm.

24mm

One thing to note at 24mm is the bulge of astigmatism-like separation in the middle of the field, from 4mm to 12mm or so off-axis. I’m not sure what this will look like in photographs, but it might be, well, different. Or maybe not noticeable. I’ll be interested to see.

Olaf Optical Testing, 2017

50mm

Things sharpen up nicely and the curves become much smoother and more regular. I expect 50mm is not only the sharpest part of the zoom range, but probably has the best out-of-focus appearance, too.

Olaf Optical Testing, 2017

70mm

Resolution drops off at 70mm, but the curves stay smooth well away from center.

Olaf Optical Testing, 2017

Copy-to-Copy Variation

I can’t say this was great, honestly. At 24mm we have a nice, tight range but things get a bit random at both 50mm and 70mm. Overall I’d call this better than average at 24mm and a little below average at 50mm and 70mm.

24mm

Olaf Optical Testing, 2017

50mm

Olaf Optical Testing, 2017

70mm

Olaf Optical Testing, 2017

Field of Focus Curvature

Please don’t mistake this for distortion measurements, which someone did a couple of weeks ago.

24mm

There’s a gentle curve at 24mm in the sagittal field, with the tangential field curving more severely.

Olaf Optical Testing, 2017

 

50mm

At 50mm the sagittal field is perfectly flat with the tangential field reversing curvature into a mild mustache pattern.

Olaf Optical Testing, 2017

 

70mm

At 70mm the sagittal field remains flat. The tangential field, well, we had the expectations set a little high on our bench and the curve really didn’t resolve well enough for us to tell clearly what the tangential field is doing. Maybe a mustache. Maybe who cares.

Olaf Optical Testing, 2017

 

Comparisons

Well, the charts are nice and all, but it’s always good to have comparisons. I’ve carefully selected the ones I think are appropriate and avoided the ones you wanted to see. It’s not that I’m purposely cruel, wait, yes it is.

Sigma 24-70mm f/2.8 Art vs Canon 24-70mm f/2.8 Mk II

The Canon 24-70 f/2.8 L Mk II is about as good as it gets for zooms in this range.

24mm

The Canon is at its best at 24mm and the Sigma gets pretty beaten up here.

Olaf Optical Testing, 2017

50mm

At 50mm the story is a little different. The Sigma is at its best at 50mm while the Canon has dropped off a bit. In the center things are completely even, and the Canon is just a little bit better in the middle of the field. So if you want to compare your new Sigma Art to your buddy’s Canon, try to do it at 50mm.

Olaf Optical Testing, 2017

70mm

Both lenses have fallen off a bit at 70mm. The Canon is a little better here but not as dramatically better as it was at 24mm.

Olaf Optical Testing, 2017

 

Sigma 24-70mm f/2.8 Art vs Tamron 24-70mm Di VC

This is probably a more reasonable comparison: the two image-stabilized third-party zooms. The Tamron G2 version will be out soon and is expected to be better, but we don’t have MTF tests on it. Because soon is not the same as now.

24mm

At 24mm, the Tamron is clearly a bit better.

Olaf Optical Testing, 2017

50mm

The Sigma again shows it is at its best at 50mm, and particularly away from center, it is a little better than the Tamron.

Olaf Optical Testing, 2017

70mm

At 70mm the Sigma is better than the Tamron, which is clearly weakest at 70mm.

Olaf Optical Testing, 2017

 

Sigma 24-70mm f/2.8 Art vs Nikon AF-S 24-70mm f/2.8 ED VR

Nikon has a different emphasis in their 24-70, giving up some center sharpness in exchange for good sharpness across the entire field.

24mm

The pattern is familiar: at 24mm the Sigma just isn’t as good.

Olaf Optical Testing, 2017

50mm

At 50mm, though, the Sigma is clearly sharper in most of the frame. This is the weak focal length for the Nikon and the strongest range for the Sigma. In the outer 1/3, however, the Nikon is a little sharper.

Olaf Optical Testing, 2017

70mm

At 70mm the Sigma has better sharpness at the higher frequencies, while the Nikon is smoother away from center.

Olaf Optical Testing, 2017

 

Conclusion

I’ll admit I’ve been a bit of a Sigma Fanboy lately. The only thing better than aggressively trying new things is aggressively trying new things and making them awesome and that’s what Sigma has been doing. But I’m not a big fan of this lens. This is an adequate lens, but nothing more than that.

I’d probably feel better about it if it didn’t have ‘Art’ on the label. I’ve come to recognize Sigma Art to mean ‘as good as or better than any other lens at that focal length, even when the others cost way more.’ This lens I would describe as adequate overall. It’s weak at 24mm and good (but not awesome) at the longer parts of the zoom range.

If it didn’t say Art on the side and cost a few hundred dollars less, I’d probably be less disappointed. If I was being snarky, I’d say they left the “F” off of Art on this one. But I’m trying to be less snarky these days so I won’t say that. Or at least won’t say it again.

The Sigma 24-70mm f/2.8 Art’s better performance at the long end may appeal to people who already have 24mm covered with a good wide-angle lens. If you use your 24-70 f/2.8 mostly at 50mm and 70mm, then the weakness at 24mm may not bother you much.

I think most people considering this lens are going to wait to evaluate the Tamron G2. If the Sigma price falls significantly it may be a more attractive option, but right now I can’t see a strong reason to make it your 24-70 choice. It’s not a bad lens, just not an Art lens, really.

 

Roger Cicala and Aaron Closz, with the invaluable assistance of hard-working intern Anthony Young

Lensrentals.com

July, 2017

 

Addendum: As requested, comparison to the Tokina 24-70 f/2.8

24mm

Olaf Optical Testing, 2017

50mm

Olaf Optical Testing, 2017

 

70mm

Olaf Optical Testing, 2017

Author: Roger Cicala

I’m Roger and I am the founder of Lensrentals.com. Hailed as one of the optic nerds here, I enjoy shooting collimated light through 30X microscope objectives in my spare time. When I do take real pictures I like using something different: a medium format camera, a Pentax K1, or a Sony RX1R.

  • Gabriel

Who cares…most people buy a 2.8 to shoot at 2.8, not to freakin’ stop down!

  • 24×36

    Since they screwed up the ergonomics (on this and the last version), making the zoom ring turn in the opposite direction compared with all of my other Nikon mount zooms (ironically all Sigmas), I’m actually kind of glad it’s a dud. I wouldn’t have bought it anyway, and now I don’t feel like I’m missing anything. ;-D

    So in my own twisted logic kind of way, thanks for the good news, Roger! ;-D

  • David Alexander

The 24/28-70/2.8 seems to be Sigma’s Achilles heel. None of them (and there have been, what, at least a half-dozen over the years?) have been best-of-breed. It’s curious.

  • Chuck Seybert

    Thanks Doc,
Keep up the good work. Looking forward to the Tamron 24-70 2.8 G2 tests.

Sigma does extensive testing, has developed a target-based analysis system built on their own equipment, and at least owns MTF benches. (I have no knowledge of how they use them.) They also have started putting out some real-world MTF tests, as well as the computer-generated ones. They’re testing more aggressively than probably anyone else.

Zeiss does actual MTF testing on their own benches, but the K8-K9 are a bit long in the tooth and have some limitations. When I last discussed it with them, they only did a single cut across the lens, so they assume rotational symmetry.

Canon has probably the most advanced testing on-axis during assembly, but they don’t test off-axis very enthusiastically and tend to believe the easy (but very inaccurate) theory that if it’s good on-axis, it’s good everywhere.

But mostly, companies don’t want to spend money on metrology unless forced to. It’s expensive. And if it finds flaws, that’s expensive too. They do it when they absolutely have to if the company is run by accountants, or sometimes because they want to, if the company is run by engineers. That’s a generalization, obviously, but a pretty accurate one.

  • JCT – these were copies bought from camera stores. New in box, readily obtainable, nothing special.

  • Xrqp

    You greatly (excessively?) generalize. Not helpful compared to OLAF. You never know at what f stop the MTF could jump.

  • Xrqp

The MTF graph that Tamron shows is identical to the previous lens. Is it possible the coating affects the transmission more, and not so much the MTF?

  • Xrqp

    The OLAF data shows the Nikon lens is better than the Sigma overall. Maybe not enough to justify $1100 more.
I guess you could say Sigma is better if you severely discount the outer area of the image. But the outer 1/3 by linear distance is 55% of the image area. The outer 1/4 is 44% of the area.
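As a sanity check on the arithmetic above, treating distance from center as a fraction of a normalized circular image radius (a simplification; real frames are rectangular):

```python
# Fraction of a circular image's area lying outside a given fraction of the radius.
def outer_area_fraction(inner_radius_fraction):
    return 1 - inner_radius_fraction ** 2

print(outer_area_fraction(2 / 3))  # outer 1/3 by linear distance: ~0.556, i.e. ~55%
print(outer_area_fraction(3 / 4))  # outer 1/4 by linear distance: 0.4375, i.e. ~44%
```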

  • Brandon Dube

    I do not see the connection between accidents and the ability of our race to model things.

  • JCT

Nor did Apollo 13 almost cost the lives of three astronauts, or the Hubble Space Telescope become flawed, when all the specs said they shouldn’t. Stuff happens.

  • Brandon Dube

    Ok… but all situations can be characterized and parameterized in the same way as a test or simulation. We did not get to the moon because we threw up our hands and said we couldn’t model things.

  • JCT

    Brandon.. I’m going to just have to believe my thoughts just as I’m sure you believe yours. NO modeling can ever account for EVERY given situation or combination of equipment, lighting, user knowledge/choices (or error). It can “predict” what it will do, but only within the parameters set forth by the testing equipment and environment of the tests. Any scientific study is always bound to that construct. Again, my opinion.. I appreciate and respect yours.

  • Brandon Dube

    Those 4000 are just the ones we’ve tested rigorously. Lensrentals (i.e. not Olaf, with test charts not an MTF bench) has probably crossed 100,000 lenses doing that by now.

    There’s nothing to tweak in optical manufacturing once you begin volume production. The traditional way carries too much cost and risk to change, and CNC is always best effort per-piece and there are no batches.

    The mechanics could change, but the ones relevant to the optics never have. Just PCBs moving around or being replaced or upgraded, different seals, etc.

    “bad runs” and serial number ranges either do not exist, or are so uncommon they are not worth discussing. We would have seen it in our results by now, and internally we track far more than we show on the blog.

  • Brandon Dube

Ah, but your claim is false. Bench testing can predict, “totally,” how a lens will perform in a given user’s hands in any varying light they either create or are given by Mother Nature. This is covered in a number of ISO standards: 9334, 9335, 9358, and 11421, for MTF and stray light.

    To believe otherwise is to believe that the optics and opto-electronics industry/industries are filled with people incapable of modeling and understanding imaging which is simply untrue.

    If your principal concern for a lens is its contrast and resolution at full aperture (or other apertures we show data for), then these charts can be a decision maker. If you care about stray light, color, or other imaging properties it would be ridiculous to expect a resolution chart to tell you about those things.

  • JCT

I appreciate that effort, but then that doesn’t account for how bad copies (and we know that there are those) within, at times, certain serial number ranges aren’t as good as earlier or later versions of the same lens. I have nowhere near the experience level or detailed knowledge you have, but I also believe, given the chance, that “tweaks” can be and likely are made between production runs.. just an opinion.. Your own methodology of testing supports that: you average the results of 10 lenses. If there were no optical difference, just test one.

  • JCT

    I think we’re going to have to agree to disagree Brandon. My customers don’t buy images because I tell them the MTF data said the lens was performing up to spec. They buy images based on every single thing that I, as a photographer, create with the equipment on hand. A lens is only one component, an important one (and I’ve conceded it should perform up to specs), but just as no two photographers are the same (nor their vision), no two scenes are likely ever the same (unless we’re talking somewhat static studio lighting – but even there things like skin tone, etc. can vary greatly).

For me, the value of this, or any lens, doesn’t come from a data chart. It comes from what I see in the results, how I feel when using it, how it holds up when I absolutely HAVE to have it perform, etc. The MTF charts are only one piece, and again, for me, static bench lighting cannot predict “totally” how a lens will perform in a given user’s hands in any varying light they either create or are given by Mother Nature.

    Again, I think your efforts are important, but not a decision maker – which unfortunately some use these charts as.

  • Brandon Dube

    We’ve tested over 4,000 lenses, ranging from the very first ones off the production line to ones built 15 years after a lens was released. We have never detected an optical difference between production runs. Slipping a month or two on release schedule is not very long, I doubt they made any significant changes to the design, it is far more likely they were waiting for shop floor time while a different lens was being produced. Most lenses are not produced continuously, and I would imagine the 14/1.8 and new cine lenses were having stock built up.

  • Brandon Dube

    Olaf the instrument does not measure MTF. It could with some modifications, but those would be detrimental to its use as an alignment station, which is its true purpose. The location of the pinhole(s) in Olaf is also undesirable for lenses with short focal lengths, fast apertures, and good image quality.

    Olaf is also a company, and all we do is optical testing and alignment.

I have used every commercially available MTF bench on the market, and a number of custom-built ones. Olaf has an ImageMaster with all the bells and whistles, and also a Wells Research bench that now sits on a shelf with all of our other eclipsed equipment (Imatest, projection setups, etc.). All lens MTF measurement techniques should yield the same results, given the same conditions. In this case, the conditions are the spectrum of light and the object distance. The purpose of these quarter-million-dollar, traceable instruments is that measurement bias and errors are removed.

    The same is also true of slanted-edge or other system MTF test setups, but slanted-edge is much more algorithmically intense than the pinhole or slit techniques used in lens-only MTF benches and there is a lot more room for debate in what is right and wrong.

I think MTF correlates exactly with “field work,” if you understand what it means. A scene ultimately has some spectrum of spatial frequency content and some spectrum of color content. MTF is exactly the degree to which the contrast at any given spatial frequency is suppressed, and if the lighting in the MTF test and the scene are sufficiently similar in color then you could even use the MTF tests to simulate with great accuracy what an image taken of the scene would look like.
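Brandon’s description of MTF as the degree to which contrast at each spatial frequency is suppressed can be illustrated with a toy calculation (the MTF values below are invented, not measurements of any lens discussed here):

```python
# Toy model: a lens multiplies the contrast of each spatial-frequency
# component of a scene by its MTF at that frequency. Values are invented.

def image_contrast(scene_contrast, mtf):
    """Contrast of one sinusoidal component after passing through the lens."""
    return scene_contrast * mtf

# A 100%-contrast target at three spatial frequencies, through a hypothetical lens:
for lp_mm, mtf in [(10, 0.90), (30, 0.60), (50, 0.35)]:
    print(f"{lp_mm} lp/mm: image contrast {image_contrast(1.0, mtf):.2f}")
```

In this model a perfect lens would have an MTF of 1.0 at every frequency; real lenses roll off toward higher frequencies, which is why fine detail loses contrast first.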

  • JCT

Brandon.. I wasn’t so much challenging that they were “production” level.. but perhaps “what” production level. We’ve often seen “early” production runs differ from “mass final go-to-market” production runs based (perhaps) on final tweaks coming from testers (like yourselves) and others. Clearly the lens was “later” than Sigma originally announced.. Could that have been a tweak to some coating process? Perhaps. At this late stage it would preclude a redesign, but in Sigma’s case could also mean some tweak in software as well.

  • JCT

Brandon.. To be fair, OLAF is not the only higher-end testing mechanism. There are also the K8, ImageMaster, LensCheck, and Wells Research. I’m not suggesting you need to use them all to validate OLAF, just that different systems can (and likely will) produce different results.. How much different, I don’t know, but that would also be an interesting comparison.

I’m also not ready to “diss” the designs at Canon and Nikon, and now how Sigma and some others are upping their game. Should a lens live up to design expectations (accounting for some copy-to-copy variation)? Yes, it should, but again we’re also talking about how light on a bench vs. “lighting” in a real-world shoot behaves. The two don’t necessarily equate, and to my mind both only simulate an expectation. Whether design or bench, neither test is more than “here’s how a lens performs given a certain lighting situation (or set of situations).” Neither emulates field lighting, which I think is in line with your other reply talking about the 135L and 14L II.

For me, the real performance tests will come with varying shots I can create and side-by-side lens comparisons, first with the Canon 24-70 (v1) I have against the Sigma, then if necessary a comparison against (most likely) either a rental Canon 24-70 2.8 L II or a refurbished one from Canon.

My only “argument” with MTF test results is that opinions are formed (and in some cases decisions made) without ever having the lens in your hand… There are other characteristics that also contribute to a lens choice that MTF never addresses: IS/OS, build quality (including weather seals), etc., and then measuring the whole ball of wax against price.

  • Brandon Dube

Currently, neither Sigma nor any other manufacturer does MTF testing with the level of rigor Olaf does. It is expensive and requires expertise they don’t have.

  • Brandon Dube

    Most don’t want to spend the money on metrology. It costs a lot and does not drive profits, which does not make sense to most business people.

  • Brandon Dube

    JCT,

All the tests we do are for production copies of lenses. We should always be conscious of expectations. Even at 24mm, where the new 24-70A is weakest, on-axis it is very similar in performance to the 135L, a highly respected lens, and the corners are pretty comparable to, e.g., the 14L II.

  • Chuck Seybert

    Someone over at DPR, in the third party lens forum just received his lens a couple days ago. Said it was fantastic.

  • JCT

Chuck.. I am curious why none of the major sellers (B&H, Adorama, etc.) have the lens, and whether some of these others were earlier production runs and not final? I saw Amazon say they would start shipments on July 29th.

  • Chuck Seybert

I just returned this lens to Lensrentals after having it for seven days. Really enjoyed it. I found it performed better than my old Canon 24-70 v1 and my just-sold Tamron 24-70 2.8 VC. Waiting for the new Tamron before deciding which one to purchase.

  • JCT

Chuck.. I’m sure they do, before, during, and after the final build, which is why some of what Roger found is troublesome given the “Art” lens history. Zoom lenses very rarely keep pace with their prime lens counterparts.. there is always some sacrifice made for the convenience of a zoom. Again, downloading the RAW files and processing in LR would indicate a very high-performing lens (someone on another forum suggested using the Canon 24-70 Mk II profile in LR6; it works very well).

  • Chuck Seybert

    Hey All
What I don’t understand is why a company as big as Sigma would not do the same tests as Lensrentals and try to improve on the things that Roger and Co. have found.

  • JCT

    Roger,

Thanks for the efforts. A couple of questions if I can. I too, like many others, have been spoiled by the Art 1.4 primes.. just great lenses. That makes this review, while I’m sure accurate, difficult to understand, as Sigma has really tried to up the game on all counts. The 150-600 Sport is just great, at least my copy (a beast, but great).

So Sigma originally planned a June release of these as I understand it, and we’re near the end of July.. Mine is on pre-order. What I’m wondering, if you’re allowed to comment, is what production level were the 10 you had? Were they ones from “around the block” a few times with different reviewers, etc., or were they ones ready to ship? I’m wondering if the delay has anything to do with Sigma tightening up as much as they can before full mass production. The RAW images I’ve downloaded from various sites show the lens performing very well at a lot of different focal lengths, f/stops, etc.

    Again, appreciate the efforts.
