Apple just delayed iPhone photo scanning program following backlash
Apple won't scan iPhones for child-sex images just yet
Apple has reportedly decided to delay its controversial upcoming program to scan iPhones for child-sexual-abuse material (CSAM).
"Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material," Apple said in a statement emailed to reporters.
"Based on feedback from customers, advocacy groups, researchers & others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
Apple's plan, announced last month, was to simultaneously scan iPhones and iCloud for known CSAM images. It was supposed to be implemented by the end of 2021 as an update to iOS 15, which itself is likely to be rolled out in September or October.
The rather complicated system would use a machine-learning-derived perceptual hash to fingerprint images in a user's photo library and match those fingerprints against a database of known CSAM images provided by the National Center for Missing & Exploited Children (NCMEC).
If a total of 30 CSAM matches were to be found both on a user's iPhone and in the same user's iCloud Photos folder, then the system would flag that Apple account and the matching images for human review.
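The threshold logic described above can be sketched in a few lines. This is only an illustrative simplification, not Apple's actual design: the real system used a perceptual hash (NeuralHash), a blinded database and cryptographic private set intersection, so that neither Apple nor the device could learn partial match counts. All names and the plain set lookup here are hypothetical.

```python
# Hypothetical sketch of threshold-based hash matching. Apple's real
# protocol hid intermediate match counts cryptographically; this plain
# version only illustrates the "flag at 30 matches" behavior.

REVIEW_THRESHOLD = 30  # number of matches before human review

def count_matches(device_hashes, known_csam_hashes):
    """Count how many on-device image hashes appear in the known database."""
    known = set(known_csam_hashes)
    return sum(1 for h in device_hashes if h in known)

def should_flag_for_review(device_hashes, known_csam_hashes,
                           threshold=REVIEW_THRESHOLD):
    """Flag the account only once the match count reaches the threshold."""
    return count_matches(device_hashes, known_csam_hashes) >= threshold
```

The point of the threshold is that a single false-positive hash collision is not enough to trigger review; an account is flagged only after the count crosses the limit.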
Apple said CSAM scanning preserves privacy
Apple has defended its program as protecting user privacy while at the same time protecting abused children.
"We ... see what we are doing here as an advancement of the state of the art in privacy, as enabling a more private world," Apple VP of Software Engineering Craig Federighi told The Wall Street Journal.
Federighi added that the scanning system was designed "in the most privacy-protecting way we can imagine and in the most auditable and verifiable way possible."
Apple's position is that other cloud-storage companies already scan uploaded user images for CSAM without notifying users, whereas Apple does not and would not until its new system was in place. (Apple does already scan iCloud Mail for CSAM.)
Privacy advocates were not convinced
Despite the reassurances, the announcement was met with a huge outcry from privacy advocates and technology-policy experts. The Electronic Frontier Foundation called Apple's program "mass surveillance," and it joined the ACLU, the Center for Democracy and Technology and dozens of other groups in writing a letter to Apple CEO Tim Cook asking him to drop the program.
"We thought our devices were ours, and Apple had taken pains during Apple v. FBI to say, 'Your device is yours. It doesn't belong to us,'" said Riana Pfefferkorn, a research scholar at Stanford University's Center for Internet and Society, in an interview with The Verge. "Now it looks like, well, maybe the device really is still Apple's after all, or at least the software on it."
There's some speculation in the tech community that Apple may have planned to implement the CSAM-scanning program to satisfy law-enforcement authorities.
According to a Reuters report from early 2020, the company planned in 2018 to fully encrypt iCloud backups of iPhones so that even Apple could not read them, but dropped the plan after the FBI said it would hamper criminal investigations.
"I think there's some kind of political strategizing going on behind the scenes here," Jen King of the Stanford Institute for Human-Centered Artificial Intelligence told The Verge. "If they are trying to take a bigger stand on encryption overall, that this was the piece that they had to give up to law enforcement in order to do so."
When asked by Tom's Guide whether there might be some quid-pro-quo deal between Apple and the U.S. Department of Justice regarding CSAM scanning, an Apple spokesperson had no comment.

Paul Wagenseil is a senior editor at Tom's Guide focused on security and privacy. He has also been a dishwasher, fry cook, long-haul driver, code monkey and video editor. He's been rooting around in the information-security space for more than 15 years at FoxNews.com, SecurityNewsDaily, TechNewsDaily and Tom's Guide, has presented talks at the ShmooCon, DerbyCon and BSides Las Vegas hacker conferences, shown up in random TV news spots and even moderated a panel discussion at the CEDIA home-technology conference. You can follow his rants on Twitter at @snd_wagenseil.
