Update 4:00 pm ET: Google is investigating the leaks that led to this report. The company announced in a blog post that it is "conducting a full review of our safeguards in this space to prevent misconduct like this from happening again."
If you've got a Google Home speaker at home, or you use the Google Assistant smartphone app, your privacy may be at risk.
According to a new report from Belgian broadcaster VRT NWS, Google employees listen to sound files from your Google Home speakers and the Assistant app to improve Google's speech recognition. These include both deliberate and accidental recordings, which can contain sensitive information.
It shouldn't be news to anyone that Google stores your recordings; this is in Google's terms and conditions, and it's how the voice assistant improves.
But you might be surprised to learn that Google's employees listen to recordings, and that they might hear more than you think.
MORE: 5 Ways to Secure Your Google Home Device
When Google Assistant has difficulty making out a command, employees check the AI-generated transcript against what they hear in the clip. They also annotate the transcript with qualitative details, including characteristics of the speaker's voice, what they're saying and how they say it.
Although the recordings are anonymized, VRT NWS was able to make out names, addresses and other identifying details in the clips it heard, which made it easy to track down the speakers.
The reporters even played some of the clips back to the people who had made them. The recordings came from a Google contractor, who passed them along without Google's permission.
The devices also made some accidental recordings (Google Assistant sometimes activates when it mishears a similar-sounding word as its wake phrase), and those clips contained all sorts of sensitive material.
In a statement to the Dutch broadcaster NOS, which also ran the story, Google said (via Google Translate):
"We work with language experts around the world to improve speech technology by making transcripts of a small number of audio clips. This work is crucial for the development of technology that makes products such as the Google Assistant possible.
Language experts review only about 0.2 percent of all audio clips, and those clips are not linked to personally identifiable information.
We have recently learned that one of these language experts may have violated our data security policy by leaking Dutch-language audio clips. We are actively investigating this, and when we find a breach of our policy, we will take action quickly, up to and including the termination of our agreement with the partner."
If you're a Google Home owner and you're alarmed, don't worry; there are ways to keep employees from listening in on you.
If you're worried Google might have recorded something it shouldn't have, you can delete individual Google Home recordings (or your entire history) from your account. And make sure your device's microphone is off when you're discussing anything personal.