Android flagships like Samsung's Galaxy S8 also have impressive cameras, but Apple's tight integration of hardware and software means raw specs matter less than how the iOS Camera app performs in the real world.
Even former Googler Vic Gundotra, who oversaw Google+, admits that Android phones have fallen behind the iPhone when it comes to photography.
"Here is the problem: It's Android," Gundotra wrote in a Facebook post. "Android is an open source (mostly) operating system that has to be neutral to all parties. This sounds good until you get into the details. Ever wonder why a Samsung phone has a confused and bewildering array of photo options? Should I use the Samsung Camera? Or the Android Camera? Samsung Gallery or Google Photos? It's because when Samsung innovates at the hardware level (like a better camera) they have to convince Google to allow that innovation to be surfaced to other applications via the appropriate API. That can take years.
"Apple doesn't have all these constraints," Gundotra continued. "They innovate in the underlying hardware, and just simply update the software with their latest innovations (like Portrait mode) and ship it. Bottom line: If you truly care about great photography, you own an iPhone. If you don't mind being a few years behind, buy an Android."
Those are bold words, but if the camera-specific code found in the HomePod firmware is any indication, Apple is setting out to prove them true.
Get Ready for 'SmartCamera'
Apple’s HomePod is spilling all of the company’s secrets about the upcoming iPhone. The firmware code Apple released for its upcoming speaker is full of Easter eggs hinting at what’s to come — not for the HomePod itself, but for the iPhone 8.
Developer Guilherme Rambo dug into the HomePod firmware, which is built on iOS 11.0.2, and discovered a “SmartCamera” feature that will automatically detect the scene or object you’re shooting and optimize the camera’s settings to take the best photo.
The scenes Rambo found in the HomePod code include fireworks, snow, sunrise, sunset and a bright stage — all difficult to shoot without granular controls. Apple also included a “FreezeMotion” feature that can detect fast-moving subjects like pets and babies.
As MacRumors noted, the iOS Photos app already uses facial detection and object recognition to help you sift through your images to find selfies and places. But it looks like Apple wants to expand that technology to the Camera app itself, which will make capturing specific scenes and subjects easier.
Pay With Your Face
Rambo also found data strings in HomePod’s code that reference something called “pearl,” which looks like it might be a codename for facial recognition. Rumor has it that Apple is looking to ditch the iPhone’s physical home button but is trying to figure out what to do with Touch ID. The firmware code indicates that Apple has developed a way to use facial recognition to authenticate payments and unlock phones, rendering a fingerprint sensor unnecessary.
If facial recognition doesn’t pan out, Apple is also reportedly eyeing screens with fingerprint sensors embedded in them, although developer Steve Troughton-Smith dug through the iOS 11 beta and said there’s no reference to an under-display Touch ID sensor.
Another option is moving Touch ID to the back of the phone, an awkward design choice that rivals like Samsung have settled for on their own edge-to-edge displays. It’s unlikely that Apple design chief Jony Ive would follow the crowd on that one.