
iPhone 11 Deep Fusion Camera Tested: Here Are the Results


The iPhone 11's Deep Fusion is almost here. And if you're part of Apple's iOS 13 beta program, you can try out the iPhone 11 photography feature right now as part of the iOS 13.2 public beta.

We didn't want to wait to see just how much detail and life Deep Fusion adds to medium-light images. So we installed the iOS 13.2 beta on our iPhone 11 Pro, and went to a somewhat dim corner of the Tom's Guide office to take a few shots and compare them against similar ones captured on an iPhone 11 Pro Max running the latest public release of iOS 13, without Deep Fusion.

If you want a more granular explanation of how the feature works, you can check out our handy explainer on Deep Fusion. But in layman's terms, it's a supercharged version of Apple's existing Smart HDR technology that's exclusive to the iPhone 11 series.

Like Smart HDR, Deep Fusion requires the iPhone to capture many exposures in quick succession. Yet Deep Fusion is more sophisticated, because it sifts through each exposure, pixel by pixel, to produce one optimized result that blends the best of every frame. It's also designed specifically for medium-light scenes between the extremes of light and dark, where neither Smart HDR nor Night Mode would suffice.
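Apple hasn't published Deep Fusion's internals, but the basic idea of per-pixel blending across several exposures can be illustrated with a toy sketch. The snippet below is purely hypothetical: it weights each frame's pixels by local detail (gradient magnitude), so the sharpest frame wins at each pixel. Real Deep Fusion uses machine learning on the Neural Engine and is far more elaborate; the function name `blend_frames` and the weighting scheme are our own illustration, not Apple's algorithm.

```python
import numpy as np

def blend_frames(frames):
    """Toy multi-frame blend: weight each frame's pixels by local
    detail (gradient magnitude), so sharper regions dominate the
    per-pixel result. Illustrative only -- not Apple's actual
    Deep Fusion pipeline."""
    frames = [np.asarray(f, dtype=np.float64) for f in frames]
    weights = []
    for f in frames:
        gy, gx = np.gradient(f)            # local detail estimate
        weights.append(np.hypot(gx, gy) + 1e-6)  # avoid all-zero weights
    w = np.stack(weights)
    f = np.stack(frames)
    # Weighted average across frames, pixel by pixel
    return (f * w).sum(axis=0) / w.sum(axis=0)

# Simulate three noisy grayscale exposures of the same simple scene
rng = np.random.default_rng(0)
scene = np.tile([[10.0, 200.0], [200.0, 10.0]], (2, 2))
shots = [scene + rng.normal(0, 5, scene.shape) for _ in range(3)]
result = blend_frames(shots)
print(result.shape)  # (4, 4)
```

The point of the sketch is just the shape of the technique: several exposures go in, a single frame comes out, and the decision of which exposure to trust is made independently at every pixel rather than for the photo as a whole.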

So exactly how good is Deep Fusion, and what does it add to the iPhone 11's already excellent photography? While it's important to remember that this feature is still in beta, we're encouraged by the results thus far — even if it doesn't appear to be the revolution in mobile photography Apple touted at its Sept. 10 iPhone 11 launch.

Take, for example, the photos of my colleague Rami Tabari you see above. We had to crop both images to 100% to highlight the granularity Deep Fusion added to the inside of Rami's hoodie and red shirt. His skin and facial hair are sharper as well, while the highlights striking his face from the studio lamps appear brighter.

The thing is, I'm not sure I would have really noticed the differences if I didn't pinch to zoom in very closely. That's not to say Deep Fusion isn't a great asset to have — far from it. It's just that you have to know where to look to understand what it's doing and why it's important.

We know Apple loves its sweaters — it is getting to be sweater weather after all — so Tom's Guide's Henry T. Casey donned his favorite composition book-themed knitwear to pose for a second test.

The strong contrast between the white and black fabric makes it difficult to determine what detail, if any, Deep Fusion is lending to Henry's sweater here, even at 100% zoom. (The take with Deep Fusion on looks a tad sharper to my eye, though that could very well be the placebo effect at work.)

However, when you look at his face instead, the difference is actually quite stark. Pay attention to the start of Henry's hairline near his temples — Deep Fusion resolves this region with more precision than the iPhone 11 Pro otherwise could. As in the last example, the same is true for the subject's skin texture.

Finally, I asked Henry to change into a different sweater — he came to the office prepared this morning with three of them, bless his soul — for one final comparison. Opting for a 66% crop exclusively on the fabric, it's clear that the iPhone 11 Pro Max without Deep Fusion exposes everything a bit more brightly. At the same time, the knitted patterns and designs are rendered more crisply when Apple's new tech is applied, and the tie-dyed blue of his T-shirt collar is more vivid.

We won't truly grasp the full potential of Deep Fusion until we've exposed it to more scenarios, and until Apple launches a final version of it in a stable iOS 13 release. That said, we've seen enough in this first look to get a handle on why Deep Fusion matters, and the kinds of scenarios in which it will be most useful. In this case, it's not about the big picture — this update's about the little things.