Ever, a photo storage app and website, reportedly used millions of uploaded user photos to train facial recognition technology without proper disclosure, then put the technology up for sale to third-party entities, including law enforcement and the military.
NBC News spoke with several Ever users who uploaded their photos to the site, and most had no idea that their photos were being used for a side project. Those photos helped train a facial recognition algorithm sold under the brand Ever AI. Founded in 2013, Ever AI has contracts with SoftBank Robotics, the company behind a robot capable of recognizing human emotions.
"To be absolutely clear, no user information of any kind is provided from our Ever app to our enterprise face-recognition customers," Aley told the Register. "That means that no user images are provided, and no information derived from those images, such as vectors or mathematical representations of the images, are provided to our enterprise customers."
Ever AI has not yet signed contracts with law enforcement or military agencies, but the company promises that its technology can "enhance surveillance capabilities" and "identify and act on threats," according to NBC News.
Ever AI claims its facial recognition software is 99.84% accurate, which would make it one of the most accurate products on the market. Using its facial recognition technology, the company offers "attribute identification services," which can determine someone's name, ethnicity, age, emotion, gender and even location.
Aley reportedly told NBC News that Ever shifted to creating facial recognition technology when it became clear that a free photo storage app wasn't as lucrative as the company had hoped.
Ever wouldn't be the first company to use uploaded photos to train facial recognition AI without proper consent. Earlier this year, tech giant IBM was caught using Flickr photos for facial recognition training.