The Google Pixel 3 is a fantastic camera phone, even though it's one of those rare handsets with just a single lens on the back. The new Night Sight feature confirms this.
Night Sight was teased at the Pixel 3 launch, but Google has been slow to release it. The reason for this isn't certain. What is clear is that Google didn't want the world to think Night Sight is exclusive to the Pixel 3 range: it will also be available on all of the other Pixel phones.
We were given a pre-release version of the feature (an official one and not a workaround that did the rounds recently) and it’s very impressive.
This selfie was taken in a dark corner, with a little bit of streetlight glare in the background. Normal camera mode picked out some detail but the overall quality is dark and subdued. Turn on Night Sight, and it's like dawn has broken. Colors are bright and while there's detail lost in the face, the effect is bold and bright.
Taking an image of this hedgerow was a real struggle for the normal camera mode on the Google Pixel 3. The picture's palette is muddy and mostly in shadow. When Night Sight was flicked on, it took a little longer to take the shot, but there's a real vibrancy to the result.
This is Google's AI algorithm at work, and it really is jaw-dropping just how much light and detail are added. Just look at the way the feature has picked out the greens and browns of the leaves in the foreground.
We also took the feature for a test run around London, capturing some of the capital’s best-known landmarks. The results we got back were fantastic. While the Google Pixel 3 did a decent job of capturing the night without the feature, switch it on and there’s much more detail in the shots. As you can see from the photo of the building, the sky is visibly lighter, with more cloud detail and the lights in the building sparkle that little bit more.
When taking a shot of the London Eye, the exposure time was only a few seconds, but the results are a much punchier red around the rim of the wheel, while the spokes in the middle are far more detailed than with Night Sight switched off.
The varying exposure times come down to how much handshake there is when you take an image.
Before the shutter button is even pressed, Night Sight measures your natural hand shake and takes into account any other motion in the scene. If the phone is fairly stable, Night Sight takes more time capturing the light. If there's a lot of movement, it uses shorter exposures to minimize motion blur.
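The motion-adaptive behavior described above can be pictured with a minimal sketch. This is not Google's actual code; the motion thresholds and exposure times below are invented purely for illustration of the trade-off (steady phone → longer frames, shaky phone → shorter frames):

```python
import statistics

def choose_exposure_ms(motion_samples):
    """Pick a per-frame exposure time (ms) from gyroscope-style motion readings.

    Hypothetical heuristic: the steadier the phone, the longer each frame
    can expose without visible motion blur.
    """
    motion = statistics.mean(motion_samples)
    if motion < 0.2:      # very steady (e.g. braced or on a tripod)
        return 300        # longer frames gather more light
    elif motion < 1.0:    # typical handheld shake
        return 100
    else:                 # lots of hand or scene movement
        return 30         # short frames keep motion blur down

print(choose_exposure_ms([0.05, 0.10, 0.08]))  # steady hands → 300
print(choose_exposure_ms([1.5, 2.0, 1.8]))     # shaky hands → 30
```

The key design point is that the decision happens before capture begins, so every frame in the burst is taken at an exposure the phone can realistically hold steady.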
In our tests, it was clear that Night Sight offers up a much brighter image. When we went to New Scotland Yard, the headquarters of London’s Metropolitan police, the logo on the sign was enhanced massively by Night Sight.
When shooting on Night Sight, you can actually see how long the exposure time is, because there's a progress wheel on screen while the image is taken. Google believes you can go as low as 0.3 lux with Night Sight, but to take images in light that dim you will need to mount your phone on a tripod. While this is something we have yet to test, the technology really shone for us in our tests.
When taking an image of Westminster Cathedral, it’s apparent in the Night Sight version that there is more reflection in the puddle in the foreground and the buildings to the right of the cathedral have more detail.
This was, however, a fairly well-lit shot (even though it was taken at night). Night Sight couldn't quite combat the lens flare of the street light, and Google notes that having too many bright lights in the shot will lessen the quality of the image.
On the flipside of this, Night Sight isn’t night vision. It hasn’t been created to help you see in the dark, so there needs to be some available light in the image for it to work effectively.
We also occasionally found that Night Sight would struggle a little with focus, but the feature does offer 'near' and 'far' focus options. When we toggled between them, the results certainly improved.
To coincide with the launch, Google has released a blog post that details the technology behind Night Sight. In short, the feature works by taking a number of shots to ape a long exposure. It then stitches these together and uses AI to add in color where needed.
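The stitching step can be sketched in a few lines. This is an illustrative toy, not Google's pipeline (which also aligns frames and applies learned color processing): averaging several short, already-aligned exposures approximates one long exposure while smoothing out per-frame noise. Frames here are plain lists of pixel brightness values for simplicity:

```python
def stack_frames(frames):
    """Average aligned frames pixel-by-pixel to brighten and denoise.

    Each frame is a flat list of pixel values; all frames must be the
    same length. Averaging cancels random sensor noise across frames.
    """
    n = len(frames)
    return [sum(pixels) / n for pixels in zip(*frames)]

# Three noisy short exposures of the same dim scene:
frames = [
    [10, 22, 31],
    [14, 18, 29],
    [12, 20, 30],
]
print(stack_frames(frames))  # → [12.0, 20.0, 30.0]
```

Note how the per-pixel jitter between the three frames disappears in the averaged output; a real pipeline would then scale the result up and correct colors.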
Google is confident that it has created a substantial feature here, explaining: “Night Sight is designed to capture true-to-life photos, adapting to the various lighting conditions you’ll see at night. Using machine learning, Night Sight balances the color of your photo, so that objects show their natural color at night.”
Night Sight is rolling out to all Pixel phones in the coming days.
Credit: Tom's Guide