Here's the One iPhone XS Feature That Justifies an Upgrade

It's hard to sell people on a new iPhone every year. Most people don't buy one that often, yet tech journalists covering any new product invariably compare it with the previous-generation model. The truth is that most people who buy an iPhone XS will be upgrading from an iPhone 6, 6s or 7, not from last year's iPhone 8 or X.

Sometimes the leaps forward are easy to see: With Face ID and its full-screen design, last year's iPhone X was one such leap. But this year's models (the iPhone XS and iPhone XS Max are available now, while the iPhone XR follows next month) are more iterative. So Apple has to focus on a few specific improvements to maintain the important perception that the entire platform is moving forward in exciting ways.

While the A12 Bionic processor powering the iPhone XS is certainly impressive, simply saying a phone is a bit faster isn't going to excite many people. So Apple has become very good at showcasing its prowess in designing chips, and in writing the software that drives them, through the lens of new features. And what better features to improve than those surrounding perhaps the single most important part of a smartphone: the camera?


I've been an iPhone X user for the past year, and the one feature that would make me pay to upgrade to the XS is the improved camera. The camera on the iPhone XS is a major step forward for Apple. But don't be fooled: The feature that Apple spends the most time on isn't the killer feature.

The power of fake bokeh

Ever since the introduction of the iPhone 7 Plus two years ago, Apple has ballyhooed Portrait Mode, a feature that blurs the background of an image to create the effect of shooting through a long camera lens. It works by pairing depth information (derived either from the parallax between the two rear camera lenses or, on the front-facing cameras of the iPhone X and XS, from an infrared depth sensor) with machine-learning techniques to apply a blur effect to items in the background.

It's important to point out that this effect, whether on the two-camera iPhone models or on a one-camera model like the Google Pixel 2 or the iPhone XR, is fake. There's no actual long lens doing the blurring; software fakes the effect of one on specific portions of a single image.
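To get a feel for what that software blur amounts to, here's a minimal Core Image sketch of the idea. Everything here is illustrative rather than Apple's actual pipeline: the function name is made up, and it assumes you already have the photo and a depth-derived mask in hand.

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Illustrative sketch, not Apple's pipeline: blur only the regions a
// depth-derived mask marks as background, leaving the subject sharp.
// `photo` and `depthMask` are assumed inputs; the mask should be white
// where blur is wanted and black where the image should stay crisp.
func fakeBokeh(photo: CIImage, depthMask: CIImage, radius: Float) -> CIImage? {
    let blur = CIFilter.maskedVariableBlur()
    blur.inputImage = photo
    blur.mask = depthMask // per-pixel blur strength, driven by depth
    blur.radius = radius  // maximum blur, applied where the mask is white
    return blur.outputImage
}
```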


When it's done well, it looks spectacular. Over the last year, I took some great-looking depth shots with my iPhone X, and this past week I've taken a bunch more with the iPhone XS. But if the conditions aren't quite right, or if you look closely at the edges where the blurring begins, you can usually find areas the algorithm handled wrong.

The iPhone XS ups the ante by letting you adjust the depth of field after the fact, which in essence expands or reduces the blur. What I like about this feature is that if an image has a flaw, you can dial back the blur effect and still end up with a good-looking shot. Conversely, if the shot was blurred correctly, you can crank the blur up as high as you like. (Maxing out the blur does, however, make any mistakes in the algorithm much easier to spot.)
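In terms of the hypothetical sketch above, that post-shot depth slider amounts to re-running the same blur with a different radius; because the effect is synthetic, the underlying photo never changes:

```swift
// Reusing the hypothetical fakeBokeh() sketch above; `photo` and
// `depthMask` are the same assumed inputs.
let subtle = fakeBokeh(photo: photo, depthMask: depthMask, radius: 4)
let dramatic = fakeBokeh(photo: photo, depthMask: depthMask, radius: 24)
```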

So Portrait Mode is a feature that's easy to explain and beautiful to look at. But it's not the best feature of the iPhone XS.

Looking at the sun

The best feature of the new iPhones is something Apple calls Smart HDR, which isn't a terrible marketing name, except that most people don't know what dynamic range is or why it matters.

In short, the iPhone XS is the best iPhone by far at approximating what you'd see with your own eyes. And most of the time, isn't the ultimate goal of photography to capture something we've seen and keep it forever?

Our eyes, with the help of some powerful post-optics processing by a specialized neural engine called the brain, are remarkably good at seeing detail in both dark shadow and bright sunlight. Cameras are terrible at it. Shoot a scene with a lot of contrast between light and shadow, and you'll lose the detail either in the shadows or in the brightly lit areas. That's the very definition of low dynamic range: the inability to capture everything from the darkest darks to the brightest brights with clarity.
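If you want to put a rough number on it, photographers count dynamic range in "stops," each stop a doubling of light. A back-of-envelope sketch (the luminance figures below are illustrative, not measurements):

```swift
import Foundation

// Dynamic range in stops: how many doublings of light separate the
// darkest and brightest things a camera (or eye) can record at once.
func stops(darkest: Double, brightest: Double) -> Double {
    log2(brightest / darkest)
}

// Illustrative numbers: deep shadow around 1 nit against sunlit
// surfaces around 10,000 nits is roughly 13 stops of scene contrast,
// more than a single smartphone exposure can hold.
print(stops(darkest: 1, brightest: 10_000)) // ≈ 13.3
```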

Like our own visual system, the iPhone XS combines its optics and its processing acumen to piece together images that cover that dynamic range far better than any single lens and sensor can. In my favorite part of Apple's presentation introducing the iPhone XS, marketing chief Phil Schiller explained that every time you press the shutter button, the phone shoots four standard-exposure images, all captured in a fraction of a second. Between those images, it also shoots another set of four at a different exposure, so it can cover both bright and dark areas.


Simultaneously, the camera shoots a ninth, long exposure, gathering more light for even more detail in the shadows. Then the A12 processor analyzes all those frames and combines them into a single image, one that can show off brightly lit objects and items that dwell in shadow alike.
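Here's a toy sketch of what that fusion step involves: each pixel of the output leans on whichever frames exposed it best, so highlights come from the darker exposures and shadows from the brighter ones. The weighting below is a generic exposure-fusion heuristic, not Apple's actual algorithm.

```swift
import Foundation

// Toy exposure fusion, not Apple's Smart HDR: each "frame" is a row of
// luminance values in 0...1 shot at a different exposure. Every pixel is
// weighted by how well-exposed it is (closest to mid-gray), then blended.
func fuse(frames: [[Double]]) -> [Double] {
    let width = frames[0].count
    return (0..<width).map { x in
        var weighted = 0.0, total = 0.0
        for frame in frames {
            let v = frame[x]
            // Well-exposed pixels (near 0.5) get the highest weight;
            // blown-out or crushed pixels contribute almost nothing.
            let w = exp(-pow(v - 0.5, 2) / 0.08) + 1e-6
            weighted += w * v
            total += w
        }
        return weighted / total
    }
}
```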

The result is staggering. On the iPhone XS, you can shoot directly into the sun and, other than a little lens flare, still get usable shots. I took my phone on an afternoon walk along a forest trail and was able to shoot items in shadow while the brightly lit items remained visible. I even shot into shadow so deep that I couldn't make out anything with my own eyes, yet the iPhone's camera could still pick out detail.

The best part: Since Smart HDR is turned on by default, regular iPhone users don't need to do anything, or know anything, about the feature. The net result is the same: Their photos will look better, and more like the image they have in their mind's eye.

Added dynamic range for video, too

A feature Apple promotes less is the extended dynamic range available for video shot with the iPhone XS, provided you're shooting at 30 frames per second or lower. In 30-fps 4K mode, the iPhone XS doesn't just capture a frame every 30th of a second. Instead, it captures a frame every 60th of a second, alternating between bright and dark exposures; the A12 processor then analyzes each pair of frames and intelligently combines them to expand the dynamic range of the video.
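In the same toy terms as the still-photo sketch above: consecutive 60-fps frames are paired off, each pair is blended, and out comes a 30-fps stream with broader dynamic range. The pairing logic here is illustrative, reusing the toy fuse() from the previous sketch.

```swift
// Toy sketch: alternating bright/dark frames arrive at 60 fps; fusing
// each consecutive pair yields 30 fps frames with wider dynamic range.
// Reuses the illustrative fuse() from the Smart HDR sketch above.
func mergeVideoPairs(frames60fps: [[Double]]) -> [[Double]] {
    stride(from: 0, to: frames60fps.count - 1, by: 2).map { i in
        fuse(frames: [frames60fps[i], frames60fps[i + 1]])
    }
}
```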

It's not quite HDR, but it means videos shot with the iPhone XS will be much more capable of handling scenes that mix bright and dark elements.

Video in general gets a major upgrade on the iPhone XS, from stereo audio recording to improved image stabilization that uses overscan pixels on the iPhone's image sensor to make your video look as if it were shot with a fancy gimbal instead of with your shaky human hands.
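The overscan trick is easy to picture: the sensor captures a slightly larger frame than the video needs, and each output frame is cropped from a shifted position that cancels the motion the gyroscope measured. A minimal sketch, with all names and the clamping logic my own illustration rather than Apple's implementation:

```swift
import CoreGraphics

// Illustrative sketch of overscan stabilization: the sensor frame is
// larger than the output, so each output frame can be cropped from a
// shifted position that cancels measured hand shake (dx, dy in pixels).
func stabilizedCrop(sensorSize: CGSize, outputSize: CGSize,
                    shakeDX: CGFloat, shakeDY: CGFloat) -> CGRect {
    // Center the crop, then shift it opposite to the shake...
    let x = (sensorSize.width - outputSize.width) / 2 - shakeDX
    let y = (sensorSize.height - outputSize.height) / 2 - shakeDY
    // ...clamping so the crop never leaves the overscanned frame.
    return CGRect(
        x: min(max(0, x), sensorSize.width - outputSize.width),
        y: min(max(0, y), sensorSize.height - outputSize.height),
        width: outputSize.width, height: outputSize.height)
}
```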

Sometimes it's hard to appreciate a new iPhone's features. Not everyone cares about faster processors. But if those faster processors can unlock dramatically better photos and video? Now that's a feature boost anyone can appreciate.


Jason Snell was lead editor of Macworld for more than a decade and still contributes a weekly column there. He's currently running the Six Colors blog, which covers all of Apple's doings, and he's the creative force behind The Incomparable, a weekly pop culture podcast and network of related shows.