The Pixel 2's 12-megapixel camera is already one of the best shooters you'll find in a smartphone — at least when it comes to lone rear lenses. But anyone who buys Google's new phones could see even bigger improvements to the camera soon.
In a blog post yesterday (Oct. 17), Google detailed plans to use the Pixel Visual Core, one of the Pixel 2's new co-processors specifically designed for image processing, to produce better HDR photos.
The Pixel Visual Core already exists inside every Pixel 2 and Pixel 2 XL, but it wasn't switched on at launch. It's the first custom co-processor Google has ever designed in-house for consumer use, and its primary job is processing images taken in the camera's HDR+ mode. Although it's not currently active for users, Google says support will roll out in the next developer preview of Android Oreo, version 8.1, set to arrive in the coming weeks.
Later on, the Visual Core will be available to all third-party apps using the Android Camera API — meaning that even photos taken in Instagram or Snapchat will get the same processing boost as shots from the stock camera app.
So what exactly can the Visual Core do? The examples provided by Google show a marked difference between images captured in third-party apps without access to the Visual Core, and the way they'll look going forward. The system recovers color previously lost in the shadows, while reining in the highlights for a much more balanced exposure.
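To get a feel for the general idea behind that kind of HDR merging — not Google's actual HDR+ pipeline, which is far more sophisticated — here's a toy sketch: combine several differently exposed frames of the same scene so that both shadow detail and highlight detail survive in the final image. The function name and the weighting scheme are illustrative assumptions, not anything from Google's implementation.

```python
import numpy as np

def merge_exposures(frames):
    """Weighted average of pre-aligned frames of the same scene.

    Pixels near mid-gray (0.5) get the highest weight, so blown-out
    highlights and crushed shadows contribute less to the result.
    `frames` is a list of float arrays with values in [0, 1].
    """
    frames = np.stack(frames)                    # shape (n, H, W)
    weights = 1.0 - np.abs(frames - 0.5) * 2.0   # 1 at mid-gray, 0 at the extremes
    weights = np.clip(weights, 1e-6, None)       # avoid divide-by-zero
    return (weights * frames).sum(axis=0) / weights.sum(axis=0)

# Example: a dark frame and a bright frame of the same 2x2 scene.
dark   = np.array([[0.02, 0.40], [0.10, 0.05]])
bright = np.array([[0.30, 0.95], [0.60, 0.45]])
merged = merge_exposures([dark, bright])
```

Because the result is a weighted average, every merged pixel lands between the darkest and brightest input values — well-exposed pixels dominate, and neither frame's clipped regions take over.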
You likely won't see the same kind of improvement compared to the existing stock camera app, but the Visual Core figures to be a step forward no matter how you use your Pixel 2's camera. And, like all of Google's work these days, the Visual Core operates on the basis of machine learning, so you can expect it to keep improving over time.