
Amazon Echo security loophole exploited to make them hack themselves

Amazon Echo 4th gen
(Image credit: Future)

If you've ever worried about the security of your smart-home devices, then the latest Amazon Echo security exploit won't make you feel any better. Researchers have found a way to force Echo speakers to hack themselves, so to speak, by using the devices' own speakers to issue voice commands.

Researchers at Royal Holloway, University of London and the University of Catania in Sicily found that it was possible to get an Alexa speaker to perform any number of functions by playing commands through the speaker itself.

Dubbed "Alexa vs Alexa," the hack could be performed with only a few seconds of proximity to a vulnerable Echo device. Researchers were able to use voice commands to pair an Echo with a Bluetooth device, and provided that device stayed within range, attackers could use it to issue commands to the Echo.

So long as the command included the wake-up word (Alexa or Echo), an exploited Echo could be made to buy products, control smart home devices, and even unlock doors. Researchers even added a single “yes” command that would automatically play after six seconds, just in case the Echo needed a verbal confirmation before continuing.
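The sequence described above can be sketched as a timed script of self-issued voice cues: each command is prefixed with the wake word, and a "yes" is scheduled a few seconds later in case Alexa asks for verbal confirmation. This is a hypothetical Python illustration of the timing logic only; all names here are illustrative, not taken from the researchers' paper or code.

```python
from dataclasses import dataclass

WAKE_WORD = "Alexa"   # "Echo" also works as a wake word
CONFIRM_DELAY_S = 6   # the researchers played a "yes" roughly six seconds after each command
COMMAND_GAP_S = 2     # illustrative pause before the next command

@dataclass
class VoiceCue:
    text: str        # what gets played through the Echo's own speaker
    offset_s: float  # when to play it, relative to the start of the attack audio

def build_attack_script(commands: list[str]) -> list[VoiceCue]:
    """Turn plain commands into a timed script of voice cues.

    Each command gets the wake word prepended, followed by a delayed
    "yes" cue in case the Echo requests confirmation (e.g. before a
    purchase or unlocking a door).
    """
    cues = []
    t = 0.0
    for cmd in commands:
        cues.append(VoiceCue(f"{WAKE_WORD}, {cmd}", t))
        cues.append(VoiceCue("yes", t + CONFIRM_DELAY_S))
        t += CONFIRM_DELAY_S + COMMAND_GAP_S
    return cues
```

For example, `build_attack_script(["unlock the front door"])` yields a cue for "Alexa, unlock the front door" at time zero and a "yes" cue six seconds later.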

Another version of the attack used malicious skills or radio stations to "infect" the Echo and make it vulnerable to attackers' voice commands. A third exploit enabled a skill that ran silently, letting the attacker intercept and reply to commands as if the user were talking to Alexa.

In these cases, attackers could use text-to-speech apps to stream voice commands to the Echo. The attack also relied on what the researchers call the "Full Volume Vulnerability": the Echo fails to lower its own playback volume after hearing the wake-up command, so it can hear and act on the rest of the self-issued command.

Fortunately, in response to the research, Amazon has issued a number of patches to fix several of these weaknesses. That means it’s no longer possible to use Alexa skills to self-wake devices, which the Bluetooth attack relied on, or use radio stations to deliver self-issued commands.

Amazon also emphasized that it has "systems in place to continually monitor live skills for potentially malicious behavior including silent re-prompts." Any Alexa skills that attempt to offer this are either blocked during certification, or "quickly deactivated."

But despite all the measures in place to prevent misuse, this is not the first time Echos and other voice assistants have been caught out like this. Past cases include workers being able to listen to user audio, and approved apps eavesdropping on users in an attempt to phish for passwords.

On this occasion, researchers caught the vulnerability ahead of time, and Amazon fixed the major issues, but that isn't always the case.

So if you have one of the best Alexa devices in your home, you might want to follow the advice of the researchers who uncovered this particular set of problems and mute your device when it's not in use. And don't miss our own list of 5 ways to secure your Alexa device.

Tom Pritchard
Automotive Editor

Tom is Tom's Guide's Automotive Editor, which means he can usually be found knee deep in stats about the latest and best electric cars, or checking out some sort of driving gadget. It's a long way from his days as editor of Gizmodo UK, when pretty much everything was on the table. He's usually found trying to squeeze another giant Lego set onto the shelf, draining very large cups of coffee, or complaining that Ikea won't let him buy the stuff he really needs online.