The Galaxy Note 9 has arrived, and with it comes the cynicism that this is just another minor update for Samsung’s most exciting phone. While it’s true that the Note 9 recycles some aspects of its predecessor — its exterior design being one of the most notable examples — Samsung made plenty of changes beneath the surface, including to the Note 9’s dual 12-megapixel cameras.
As with the design, it may seem that the Note 9’s shooters haven’t changed from what Samsung gave us earlier this year in the Galaxy S9+. After all, Samsung’s flagships tend to share similar camera tech.
However, the company tends to save something special for the Note brand. With last year’s Galaxy Note 8, it was a second lens at the back. This year, it’s artificial intelligence that figures to change the way you capture the world around you.
Here’s a closer look at the major changes introduced with the Galaxy Note 9’s cameras.
By now, you’re probably used to phone makers publicizing the fact that they’ve used AI to improve image processing. Huawei and LG have done it, making it a headline feature within everything from the Mate 10 Pro to the G7 ThinQ. Google also uses AI to improve HDR processing, though the company has been more discreet about it. The fact is, you need AI to get ahead in the photography game today.
Samsung knows this, too, which is why it’s gifted the Note 9’s lenses with the brains to identify scenes and tune the shot to fit each scenario.
How many scenarios? Samsung says the Note 9 can suss out 20 different types of scenes: Food, portraits, flowers, indoor environments, animals, landscapes, greenery, trees, sky, mountains, beaches, sunrises and sunsets, watersides, street shots, night shots, waterfalls, snow, birds, strong backlighting and text.
Once the phone has figured out what it’s looking at, that’s when the magic happens. Samsung’s software actively tunes aspects like exposure, white balance, contrast and brightness to make sure the shot matches the ideal for that scene.
While more hands-on time with the Note 9 will give us a definitive sense of what Scene Optimizer can produce, it’s an encouraging prospect. Samsung already made considerable strides in low-light performance with the Galaxy S9, thanks in part to its variable-aperture lens, and we’re excited to see how those shots could be improved even further through the use of some clever AI.
Still, if there are times when you’d rather not use the feature, Samsung gives you the option of turning off the Scene Optimizer.
Better processing through AI is one thing, but all of the tools in the world won’t prevent mistakes. That’s why the Note 9’s other new photography software feature, Flaw Detection, could be really useful.
Flaw Detection does what it says on the tin — it calls out issues with photos you take, and suggests ways to get around them. This happens right after you’ve pressed the shutter, so you’re not bombarded with on-screen complaints in the viewfinder.
At first glance, Flaw Detection might sound unnecessary at best or annoying at worst. But consider a real-world situation where you don’t have all the time in the world to snag that perfect shot. Phone displays have come a long way over the years in terms of size and quality, but they’re still pretty small in the grand scheme of things. When you’re quickly glancing at a picture you’ve just snapped on a relatively tiny screen, there are a lot of faults you could miss. In fact, you might only notice problems (like a slight degree of hand shake, for example) once you’ve taken a second look at your image blown up on a monitor.
Those are the situations when it’d be really handy to have Flaw Detection. The feature notices if someone in the shot has blinked, or if the lens may have been smudged. It will even request that you move to a different location to snap another attempt if the backlighting is too strong. And just as with Scene Optimizer, if you find Flaw Detection more frustrating than helpful, you can always turn it off.
Those are the two significant improvements to the cameras in the Galaxy Note 9. In terms of hardware, the equipment is very similar to what you’d find on the Galaxy S9+. Both rear sensors are still rated at 12 megapixels, with variable aperture (f/1.5 and f/2.4) on the main shooter and optical image stabilization on both.
The telephoto lens on the Note 9 can still function as a 2x optical zoom lens or enable Live Focus portraits, and super slow-motion video at 960 frames per second is once again possible — capable of capturing 0.2 seconds of real-time footage stretched to 6 seconds in playback.
The Note 9’s front camera is rated at 8 megapixels, also like on both Galaxy S9 models, with an f/1.7 aperture.
One noteworthy camera change on the Note 9 doesn’t involve the camera at all. Instead, it comes from the S Pen, which now has Bluetooth connectivity. That essentially turns the stylus into a remote control, which is very helpful for the phone’s camera. Press the button once and the camera app launches; another press can operate the shutter, which we think will prove to be handy when it’s time to take a selfie.
The Note 9 sounds like a formidable mobile camera, though we won’t really know how good it is until we’ve put it through its paces. Check back in the coming days for an in-depth camera faceoff between it and other flagship shooters, like the Pixel 2 XL and iPhone X.