We've come to expect every new generation of a flagship smartphone to add features, not take them away. However, Samsung may have different ideas for the Galaxy S21's camera system when that phone comes out next year.
A report from South Korea's The Elec, by way of SamMobile, states that Samsung has decided to ditch time-of-flight sensors in its upcoming Galaxy S21 series (also rumored to be named the Galaxy S30).
It's said there are two reasons behind the move: first, the company is struggling to find obvious use cases for its time-of-flight technology; and second, the LiDAR system expected to appear in Apple's iPhone 12 Pro models will be more powerful, and Samsung doesn't have faith its approach can compete.
According to The Elec, Samsung is hard at work on a new-and-improved indirect time-of-flight system that isn't like LiDAR, but rather builds on the hardware already present in its devices. Unfortunately, that solution isn't expected to be ready in time for the Galaxy S21's launch in the spring of 2021, so it's quite possible time-of-flight could be gone from Samsung's most popular models. Neither the Galaxy Note 20 Ultra nor the Galaxy Note 20 shipped with such a sensor. (The Note 20 Ultra does have a laser autofocus sensor, though.)
The problem is one of distance and accuracy. With LiDAR, the iPhone 12 Pro will be able to detect objects in physical space at a distance twice as great as that of conventional indirect time-of-flight sensors. The LiDAR method also produces a more detailed 3D depth map than ordinary time-of-flight, making augmented reality applications smoother, more lifelike and more accurate in the context of the surrounding environment.
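To make the distance limitation concrete, here is a minimal sketch of the physics behind indirect time-of-flight sensing — not Samsung's or Apple's actual implementation. Indirect ToF recovers depth from the phase shift of a modulated light wave, so measurements wrap around once the round trip exceeds one modulation wavelength; the 100 MHz modulation frequency below is an illustrative figure, not a spec from any particular phone.

```python
import math

C = 299_792_458.0  # speed of light, m/s


def itof_max_range(mod_freq_hz: float) -> float:
    """Maximum unambiguous distance for an indirect ToF sensor.

    The reflected wave repeats every wavelength, and light travels to
    the object and back, so range wraps at c / (2 * f_mod).
    """
    return C / (2.0 * mod_freq_hz)


def itof_depth(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Depth recovered from a measured phase shift in [0, 2*pi)."""
    return (C * phase_shift_rad) / (4.0 * math.pi * mod_freq_hz)


# At an assumed 100 MHz modulation frequency, range wraps at about 1.5 m.
print(round(itof_max_range(100e6), 3))  # ~1.499
```

Lowering the modulation frequency extends the range but coarsens the depth resolution, which is one reason a direct time-of-flight system like LiDAR — which times individual light pulses rather than measuring phase — can reach farther without the same trade-off.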
On the other hand, indirect time-of-flight sensors like those in Samsung and LG's devices are cheaper to produce, which is why they've become so common in high-end Android devices.
In terms of photography, we've tested a number of handsets with time-of-flight cameras over the years, and never found that they aided image quality in any appreciable way. Typically, phones with time-of-flight sensors use the added depth awareness to build a 3D map that can more intelligently separate the foreground from the background in shallow depth-of-field shots with simulated bokeh.
However, you can often get the same result from the stereoscopic vision of two distinct camera lenses, without adding time-of-flight to the mix. Additionally, software based on machine-learning models has improved rapidly over the last few years, to the point where single-lens devices like the iPhone SE or Google Pixel 4a can produce depth-of-field effects nearly on par with those of pricier, multi-lens flagships.
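The two-lens alternative mentioned above boils down to simple triangulation, sketched here with illustrative numbers rather than the calibration of any specific phone: an object appears at slightly different pixel positions in the two cameras, and that disparity maps to depth.

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from the pixel disparity between two horizontally offset cameras.

    Nearer objects shift more between the two views, so depth is
    inversely proportional to disparity: Z = f * B / d.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px


# Assumed values: 1000 px focal length, 1 cm lens spacing, 20 px disparity.
print(stereo_depth(1000.0, 0.01, 20.0))  # 0.5 (metres)
```

A phone computes a disparity like this for every pixel pair it can match between the two views, producing the same kind of depth map a time-of-flight sensor provides — which is why the dedicated sensor often adds little.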
All this is to say that time-of-flight has never been particularly useful in flagship phones, at least in its current iteration, and has always come across as a gimmick that hasn't quite lived up to phone makers' promises. Based on this report, it's a bit head-scratching that Samsung evidently believed in time-of-flight for as long as it did before finally canning it.
Perhaps LiDAR can succeed where previous time-of-flight attempts failed. As of now, Apple is the only smartphone brand linked to embedding LiDAR technology in its handsets. Cupertino already has experience with the tech, having introduced LiDAR in the latest iPad Pro. If LiDAR truly benefits the iPhone 12 experience, expect Apple's rivals to take note and work tirelessly to catch up — sort of like they did when Face ID proved a hit.