Apple: It takes 30 child abuse images to trigger a warning
Apple adds details to its iPhone photo scanning policy as privacy concerns grow
It’s been a confusing several days since Apple first announced its intention to scan photos uploaded to iCloud for known images of Child Sexual Abuse Material (CSAM).
Privacy advocates have objected in strong terms to the move, which would see photos scanned on the device itself before they are uploaded to iCloud. To confuse things further, Apple said in its FAQ [PDF] that this functionality would essentially be disabled if users chose not to use iCloud. The move, privacy campaigners fear, could lead to pressure from authoritarian governments for Apple to expand the functionality to help crack down on dissident activity.
- iOS 15 release date, beta, supported devices and all the new iPhone features
- Cloud storage vs cloud backup vs cloud sync: what's the difference?
- PLUS: Facebook Messenger gets end-to-end encryption
In a bid to take the sting out of the controversy, Apple has issued some clarifications. As Reuters reports, Apple now says that its scanner will hunt only for CSAM images that have been flagged by clearinghouses in multiple countries, and that it would be simple for researchers to verify that the image identifiers are identical across all devices, proving the system couldn't be adapted to target individuals.
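Apple hasn't spelled out the precise mechanism behind that promise, but the "flagged in multiple countries" requirement can be pictured as an intersection of the clearinghouses' hash lists, so that no single organization can add an image on its own. The Python sketch below is a deliberately simplified illustration under that assumption: the function name, the plain string hashes and the two example lists are all hypothetical, and Apple's real system relies on NeuralHash fingerprints and a cryptographically blinded on-device database rather than anything this direct.

```python
# Hypothetical sketch of the "flagged in multiple countries" idea: only hashes
# that appear on every clearinghouse's list make it into the on-device
# database. Names and data are illustrative, not Apple's implementation.

def build_on_device_database(clearinghouse_lists):
    """Keep only image hashes flagged by every clearinghouse."""
    if not clearinghouse_lists:
        return set()
    universal = set(clearinghouse_lists[0])
    for hashes in clearinghouse_lists[1:]:
        universal &= set(hashes)
    return universal

# A hash supplied by only one jurisdiction never reaches devices, which is
# what would make a single government's targeted additions detectable.
us_clearinghouse = {"a1b2", "c3d4", "e5f6"}    # hypothetical list
eu_clearinghouse = {"a1b2", "c3d4", "ffff"}    # hypothetical list
print(build_on_device_database([us_clearinghouse, eu_clearinghouse]))  # {'a1b2', 'c3d4'}
```

Because only the intersection would ship to devices, outside researchers could, in principle, confirm that every iPhone carries the same database, which is the auditability Apple is pointing to.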
The company also added that it would take 30 matched CSAM images before the system prompts Apple to carry out a human review and any official report can be filed. This, in part, explains why Apple felt it could promise that the chance of a false positive is less than one in a trillion per year.
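To get a feel for why a 30-match threshold makes a false report on an innocent account so unlikely, here is a back-of-envelope Python sketch. The one-in-a-million per-image false-match rate and the 20,000-photo library are assumptions chosen purely for illustration, not figures Apple has published, and the independence and Poisson approximations are ours rather than Apple's actual analysis.

```python
# Back-of-envelope illustration (not Apple's math): the probability that an
# innocent account accumulates at least `threshold` false matches, assuming
# independent matches and a Poisson approximation to the binomial.
from math import exp

def false_flag_probability(per_image_fp, photos_uploaded, threshold=30, extra_terms=50):
    lam = per_image_fp * photos_uploaded   # expected number of false matches
    term = exp(-lam)                       # P(exactly 0 false matches)
    total = 0.0
    for k in range(1, threshold + extra_terms):
        term *= lam / k                    # P(exactly k) from P(exactly k - 1)
        if k >= threshold:
            total += term
    return total

# With an assumed 1-in-a-million per-image false-match rate across a
# 20,000-photo library, the odds of reaching 30 false matches come out around
# 4e-84, far below Apple's quoted one-in-a-trillion per year.
print(false_flag_probability(1e-6, 20_000))
```

The point of the exercise is simply that stacking a high match threshold on top of an already rare per-image error drives the combined probability down by dozens of orders of magnitude, whatever the exact per-image rate turns out to be.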
Apple refused to say whether these were adjustments made in the face of criticism or specifics that had always been in place, though it did add that, with the policy still in development, changes should be expected.
Nonetheless, privacy advocates believe they’re making a difference. “Even if they don't ultimately nix the plan, we're forcing them to do the work they should've done by consulting us all along,” tweeted Stanford University surveillance researcher Riana Pfefferkorn. “Keep pushing.”
Most recently, Apple VP of software engineering Craig Federighi told the Wall Street Journal that Apple’s new policies are “much more private than anything that's been done in this area before.”
“We, who consider ourselves absolutely leading on privacy, see what we are doing here as an advancement of the state of the art in privacy, as enabling a more private world,” he said. Adding that the system had been developed “in the most privacy-protecting way we can imagine and in the most auditable and verifiable way possible,” he painted the company’s solution as preferable to those of its cloud storage rivals, which look at and analyze “every single photo.”
Federighi argued that critics don’t fully understand Apple’s implementation, and believes that the company is partly to blame for not explaining things clearly. Announcing CSAM scanning at the same time as its protections for minors using iMessage meant the two were erroneously conflated, he conceded.
"We wish that this would've come out a little more clearly for everyone because we feel very positive and strongly about what we're doing,” he said.
The “we” in that sentence may imply more uniform support within the company than actually exists. On Friday, Reuters revealed that the move had proved just as divisive inside Apple, with more than 800 messages about the plan appearing on the company’s internal Slack.
- More: Apple Child Safety photo scanning — how it works and why it's controversial