Facebook will reportedly start scanning its pages for child pornography and images of missing children, according to a statement released on Thursday by chief technology officer Bret Taylor. In conjunction with Microsoft and the National Center for Missing and Exploited Children (NCMEC), the popular social network will use photo "fingerprint" technology called PhotoDNA to search for pictures matching those stored in the NCMEC database.
"Our hope and belief is that Facebook will be just the first of many [companies to use what has proven to be highly effective technology]," said Ernie Allen, chief executive of the National Center for Missing & Exploited Children. "Online services are going to become a hostile place for child pornographers and pedophiles."
Allen assured Facebook users that the privacy and free-speech rights of adult pornography consumers won't be violated, and that the technology will be used only to find and remove known images of sexually exploited pre-pubescent children (those under 12). These are the "worst of the worst," he said, and are apparently shared over and over. Some even include snapshots of infants or toddlers.
"These are crime scene photos, not porn," Allen said. "This tool is essential to protect these victims and to prevent, to the greatest degree possible, the redistribution of their sexual abuse."
According to the New York Times, Microsoft’s PhotoDNA tech has been refined to the point of identifying these "worst of the worst" images even if they have been resized, cropped or otherwise altered. It can also comb through large amounts of data quickly and accurately "enough to police the world’s largest online services."
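PhotoDNA's actual algorithm is proprietary, but the general idea behind this kind of robust image fingerprinting can be illustrated with a simple perceptual "difference hash," a well-known technique that is not PhotoDNA itself. This hypothetical sketch shows why such a fingerprint survives a global alteration like brightening: the hash records only relative brightness patterns, not exact pixel values.

```python
# Illustrative sketch of a perceptual "difference hash" (dHash). This is a
# generic technique, NOT Microsoft's proprietary PhotoDNA algorithm; names
# and sizes here are purely illustrative.

def dhash(pixels, hash_size=8):
    """Compute a 64-bit difference hash from a 2D grayscale image (list of rows)."""
    h, w = len(pixels), len(pixels[0])
    rows, cols = hash_size, hash_size + 1
    # Downscale to rows x cols by simple block averaging.
    small = []
    for r in range(rows):
        row = []
        for c in range(cols):
            y0, y1 = r * h // rows, (r + 1) * h // rows
            x0, x1 = c * w // cols, (c + 1) * w // cols
            block = [pixels[y][x] for y in range(y0, y1) for x in range(x0, x1)]
            row.append(sum(block) / len(block))
        small.append(row)
    # Each hash bit records whether brightness increases left-to-right,
    # so uniform changes to the whole image leave the bits untouched.
    bits = 0
    for r in range(rows):
        for c in range(hash_size):
            bits = (bits << 1) | (1 if small[r][c] < small[r][c + 1] else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Toy image: a horizontal brightness gradient.
img = [[x * 2 for x in range(100)] for _ in range(80)]
# "Altered" copy: globally brightened by 40 levels.
brighter = [[min(255, p + 40) for p in row] for row in img]

# The fingerprints are identical despite the alteration.
assert hamming(dhash(img), dhash(brighter)) == 0
```

A system like the one described would compare such fingerprints by Hamming distance, treating images within a small bit-difference threshold as matches; real production systems use far more sophisticated, transformation-resistant signatures.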
Microsoft has already implemented PhotoDNA in its search engine Bing and its online file-storage service SkyDrive. Microsoft Digital Crimes Unit associate general counsel Bill Harmon said that PhotoDNA has already evaluated more than two billion digital pictures across both services, uncovering 1,000 matches on SkyDrive and 1,500 matches through Bing image indexing.
"PhotoDNA identified horrific images on our services that we would have never found otherwise," Harmon said. "[With Facebook among the world's largest photo-sharing services], their participation in the PhotoDNA program will significantly expand the program's impact."
Before implementing Microsoft's PhotoDNA tech, Facebook relied primarily on user feedback: abuse reports were reviewed by trained employees, who in turn sought out and manually deleted the offending images. But with PhotoDNA in place, the offending images won't even hit Facebook's server hard drives. As it stands now, Facebook users upload 200 million images each day.
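The screening described above can be sketched as a simple upload-time check: compute a fingerprint for each incoming image and compare it against a database of known fingerprints before the file is ever stored. Everything here is hypothetical, assuming images have already been reduced to 64-bit perceptual hashes; the names, sample values and threshold are illustrative, not Facebook's actual implementation.

```python
# Hypothetical sketch of upload-time screening against a database of known
# image fingerprints. Assumes each image is already reduced to a 64-bit
# perceptual hash; values and threshold are made up for illustration.

BLOCKED_FINGERPRINTS = {0xDEADBEEFCAFEF00D, 0x0123456789ABCDEF}
MATCH_THRESHOLD = 4  # max differing bits still treated as the same image

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def screen_upload(fingerprint: int) -> bool:
    """Return True if the upload should be rejected before storage."""
    return any(hamming(fingerprint, known) <= MATCH_THRESHOLD
               for known in BLOCKED_FINGERPRINTS)

# A near-duplicate (2 bits flipped) of a known image is still caught...
assert screen_upload(0xDEADBEEFCAFEF00D ^ 0b11)
# ...while an unrelated fingerprint passes through.
assert not screen_upload(0x5555555555555555)
```

Because the check runs before the image is written to storage, matching content never reaches the servers at all, which is the behavior the report describes.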
"We’ve found it to be a very powerful tool in identifying these images," Chris Sonderby, Facebook’s assistant general counsel, said.
On Thursday, Facebook said that it will host an online event at 3:00 p.m. Eastern time on Friday to explain the initiative.