Apple has decided to delay its controversial upcoming program to scan iPhones for child-sexual-abuse material (CSAM).
"Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material," Apple said in a statement emailed to reporters.
"Based on feedback from customers, advocacy groups, researchers & others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
Apple's plan, announced last month, was to simultaneously scan iPhones and iCloud for known CSAM images. It was supposed to be implemented by the end of 2021 as an update to iOS 15, which itself is likely to be rolled out in September or October.
The rather complicated system would use an on-device hashing algorithm, which Apple calls NeuralHash, to convert images in an Apple user's photo library into numeric fingerprints and match them against a database of known CSAM image hashes provided by the National Center for Missing & Exploited Children (NCMEC).
If at least 30 CSAM matches were found both on a user's iPhone and in the same user's iCloud Photos folder, the system would flag that Apple account and the matching images for human review.
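The matching-and-threshold logic described above can be sketched in a few lines. NeuralHash is proprietary, so the cryptographic hash below is only a stand-in to show the shape of the pipeline, and the function and variable names are hypothetical, not Apple's:

```python
import hashlib

MATCH_THRESHOLD = 30  # the reported number of matches that triggers human review


def image_hash(image_bytes: bytes) -> str:
    # Stand-in for NeuralHash. A real perceptual hash is designed so that
    # resized or re-encoded copies of an image hash to the same value;
    # SHA-256 here only illustrates the database-lookup step.
    return hashlib.sha256(image_bytes).hexdigest()


def flag_for_review(user_photos: list[bytes], known_csam_hashes: set[str]) -> bool:
    # Count how many of the user's photos match the known-hash database.
    # The account is flagged only once the match count reaches the threshold,
    # so isolated false positives alone would not trigger a review.
    matches = sum(1 for photo in user_photos
                  if image_hash(photo) in known_csam_hashes)
    return matches >= MATCH_THRESHOLD
```

The threshold is the key privacy argument: in Apple's design, the server learns nothing about individual matches below the threshold, which this plain counter does not capture.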
Apple said CSAM scanning preserves privacy
Apple has defended its program as protecting user privacy while at the same time protecting abused children.
"We ... see what we are doing here as an advancement of the state of the art in privacy, as enabling a more private world," Apple VP of Software Engineering Craig Federighi told The Wall Street Journal.
Federighi added that the scanning system was designed "in the most privacy-protecting way we can imagine and in the most auditable and verifiable way possible."
Apple's position is that other cloud-storage companies already scan uploaded user images for CSAM without notifying users, whereas Apple does not scan iCloud Photos and will not until its system is implemented. Apple does, however, already scan iCloud Mail for CSAM.
Privacy advocates were not convinced
Despite the reassurances, the announcement was met with a huge outcry from privacy advocates and technology-policy experts. The Electronic Frontier Foundation called Apple's program "mass surveillance," and it joined the ACLU, the Center for Democracy and Technology and dozens of other groups in writing a letter to Apple CEO Tim Cook asking him to drop the program.
"We thought our devices were ours, and Apple had taken pains during Apple v. FBI to say, 'Your device is yours. It doesn't belong to us,'" said Riana Pfefferkorn, a research scholar at Stanford University's Center for Internet and Society, in an interview with the Verge. "Now it looks like, well, maybe the device really is still Apple's after all, or at least the software on it."
There's some speculation in the tech community that Apple may have planned to implement the CSAM-scanning program to satisfy law-enforcement authorities.
According to a Reuters report from early 2020, Apple had planned in 2018 to fully encrypt iCloud backups of iPhones so that even Apple could not read them, but dropped the plan after the FBI objected that it would hamper criminal investigations.
"I think there's some kind of political strategizing going on behind the scenes here," said Jen King of the Stanford University Institute for Human-Centered Artificial Intelligence to the Verge. "If they are trying to take a bigger stand on encryption overall, that this was the piece that they had to give up to law enforcement in order to do so."
When asked by Tom's Guide whether there might be some quid-pro-quo deal between Apple and the U.S. Department of Justice regarding CSAM scanning, an Apple spokesperson had no comment.