Earlier this year, we learned that humans review recordings from Amazon's Alexa, and later that the same is true of Google Assistant. Now it turns out Apple's voice assistant Siri also has humans listening to your queries. An Apple contractor shared Siri's quality-control practices with The Guardian last week, raising concerns about Apple's privacy policies.
The unnamed contractor confirmed that Apple hired their employer to review a small portion of saved Siri recordings and grade Siri's responses on a number of factors, including whether the voice assistant offered a satisfactory answer and whether the activation was accidental.
After listening to several short, private recordings picked up by what were likely unintended triggers, the anonymous contractor came forward with concerns about the lack of disclosure.
“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on," the whistleblower told The Guardian. "These recordings are accompanied by user data showing location, contact details and app data.”
In its consumer-facing privacy documentation, Apple doesn't explicitly state that Siri recordings are sometimes reviewed by humans as a quality-control measure. The company only notes that the data “is used to help them recognize your pronunciation and provide better responses."
“A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user's Apple ID," Apple told The Guardian. "Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”
The tech giant also noted that less than 1% of daily Siri interactions are reviewed by humans.
Unlike Amazon's Alexa, which lives primarily in stationary smart speakers, Siri runs on devices that users carry on their persons every day. The Guardian's anonymous source highlighted the Apple Watch (which makes up 35% of the smartwatch market) as a common source of the quality-control recordings.
“The regularity of accidental triggers on the watch is incredibly high,” they said to The Guardian. “The watch can record some snippets that will be 30 seconds -- not that long, but you can gather a good idea of what’s going on.”
Apple prides itself on its reputation for privacy, and even uses privacy as a selling point to distinguish itself from Amazon, Facebook and Google. But the company offers no way for you to opt out of Siri's quality-control grading, aside from disabling Siri entirely.
If you do use Siri, there's not much you can do to avoid having your recordings saved. This revelation serves as yet another reminder that using voice and home assistants means trading away some of your privacy.