If there's anything consumers have learned in recent years, it's that everything — even devices or institutions we believe to be secure — can be hacked. With the explosion of Internet of Things (IoT) devices in millions of households across America, there are many more opportunities for our personal information to be collected, shared and stolen without our knowledge.
According to a report from Consumer Intelligence Research Partners, there are more than 50 million Amazon Echo devices installed in the U.S. Alexa is the most ubiquitous digital assistant, but many users don't know what happens to their data when they ask Alexa to read them a recipe or check their bank balance.
- The best smart home devices to make your life easier
- Amazon Alexa settings to enable and disable
- How to see and delete Alexa's recordings of you
Amazon has repeatedly denied that Alexa-enabled devices are recording at all times, but the devices are always listening for the wake word ("Alexa" is one of several options) and will record and store what is said once Alexa is activated. Recordings capture a fraction of a second before the wake word is spoken, and end when the user request has been processed.
Smart assistants' hearing is still far from perfect, and Alexa's recent spate of random laughter is a good example of voice commands gone awry. Users reported that their Alexa-enabled devices laughed unprompted, which Amazon attributed to phrases that sounded similar to "Alexa, laugh" even when users didn't say the wake word.
Does Alexa record private conversations?
There are words that sound similar enough to "Alexa" that it's possible the device will pick up fragments of private conversations that are not intended to be a command, said Pam Dixon, executive director of the World Privacy Forum, a nonprofit, public interest research group.
"The problem comes when people are not aware that recordings are stored until you delete them," she says. "People should think about what they are asking their voice assistants and know that they can delete that information."
According to an Amazon spokeswoman, Alexa recordings are stored securely on the Amazon Web Services cloud, and the company does not release customer information without a "valid and binding legal demand properly served on us."
In fact, Amazon reportedly refused to comply with a warrant for data from an Echo that police in Bentonville, Arkansas, believed to be evidence in a murder case.
"I really don't think these devices are listening and sending that data off to third parties all the time, but from reviewing my own recordings, there was a lot more than I anticipated in there," Dixon said.
If you're concerned about the privacy and security of your personal information, there are a few steps you can take to secure your Alexa-enabled device.
Protect your home network
Your smart-home devices are only as secure as the network they connect to. Start by changing the default name and password for your wireless network — don't include identifying information in either — and enable the Wi-Fi Protected Access II (WPA2) protocol on your router.
If possible, create one Wi-Fi network for your smart-home devices and another for devices you use to bank, shop or browse, and set up a firewall to restrict what — and who — can connect. Regularly check for and install firmware updates on all of your gadgets, including Alexa devices.
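A strong, random Wi-Fi passphrase is the foundation of the advice above. As a minimal sketch (the function name and 20-character default are illustrative choices, not anything Amazon or router vendors specify), Python's `secrets` module can generate one:

```python
import secrets
import string

def wifi_passphrase(length: int = 20) -> str:
    """Generate a random Wi-Fi passphrase from letters and digits.

    WPA2 accepts passphrases of 8-63 ASCII characters; 20 random
    characters is well beyond realistic brute-force range.
    """
    alphabet = string.ascii_letters + string.digits
    # secrets.choice uses a cryptographically secure random source,
    # unlike random.choice, which is predictable.
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(wifi_passphrase())
```

Set the result as the password for both your smart-home network and your main network, and store each in a password manager rather than reusing one across the two.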
Change the Alexa wake word
The first step Dixon recommends users take on their Alexa-enabled devices is to change the word that activates recording.
For now, you can use "Amazon," "Computer" or "Echo" instead. Choose the word that you're least likely to use in everyday conversation, so Alexa will record only if you speak directly to it.
It's important to remember that Alexa-enabled devices may be able to pick up strangers' voices through closed windows and doors, Dixon adds. You can also turn off the device's microphone to stop it from listening entirely.
Strengthen your Amazon password
Anyone with access to your Amazon account can listen to, share or delete your Alexa voice-recording history on the Manage Your Content and Devices dashboard. This includes family members who order items under the same username, but your information might also be vulnerable to hackers who obtain your Amazon password.
The commands you give Alexa — arming your security system, requesting directions and commute times, or calling friends — can provide malicious actors with valuable information about your daily routine, which can put your personal safety, and that of your home and family, at risk. Just as you would with any other login, follow good password hygiene recommendations.
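Part of good password hygiene is making sure your Amazon password hasn't already appeared in a known data breach. Have I Been Pwned's public Pwned Passwords API supports this with a k-anonymity scheme: only the first five characters of your password's SHA-1 hash are sent, never the password itself. A sketch of a checker (the function names here are illustrative; the network lookup requires internet access):

```python
import hashlib
import urllib.request

def sha1_split(password: str) -> tuple[str, str]:
    """Return the 5-character SHA-1 prefix and 35-character suffix."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

def pwned_count(password: str) -> int:
    """Return how many times a password appears in known breaches.

    Only the hash prefix leaves your machine; the API responds with
    all breached suffixes sharing that prefix, and the match is done
    locally.
    """
    prefix, suffix = sha1_split(password)
    req = urllib.request.Request(
        f"https://api.pwnedpasswords.com/range/{prefix}",
        headers={"User-Agent": "password-check-example"},
    )
    with urllib.request.urlopen(req) as resp:
        body = resp.read().decode("utf-8")
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0
```

A password that returns a nonzero count has been exposed and should be replaced; pairing a unique password with two-factor authentication on your Amazon account closes off the attack described above.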
Delete old Alexa recordings, especially those with sensitive information
While asking Alexa to set a timer or play cat noises is fairly innocuous, saved recordings that include sensitive health, legal or financial information are less so.
Dixon says most users don't think about the consequences of having their conversations or requests stored indefinitely where others can access them. Recordings may resurface in divorce or child-custody cases, for example.
"If you have any kind of questions or any kind of misgivings about having recordings around, just delete them," Dixon said. "The idea is just like clearing the web history in a web browser."
To listen to and delete stored recordings, open Settings > History in the Alexa app, or use the dashboard at Amazon.com. Recordings remain on the Amazon cloud indefinitely until you delete them.
Deleting all old recordings can degrade Alexa's performance slightly because the device uses your history to improve responses over time, and it will have to relearn patterns if information is lost.
If you don't want to mass-delete innocuous recordings about local weather or music requests, you can selectively remove only the more sensitive material. You can also opt out of having your recordings used to develop new features, since those recordings are more likely to be reviewed by Amazon employees or contractors.
Read third-party Alexa skill privacy policies
Third-party Alexa skills, of which there are tens of thousands, may also collect users' personal information. Amazon requires developers of these skills to provide links to their privacy policies on the skill detail pages, but consumers are responsible for digging through each to understand how data is collected, used and stored.
"Everyone is very worried about having any kind of a data breach or leak on these home devices," Dixon says. "No company right now wants to have a privacy issue — caution is the byword going forward."