Alexa and Google Assistant Speakers Open to Laser Hack Attacks
Focused light can trigger fake voice commands
An unmarked van pulls up outside your house. A laser beam shoots out of the van's back window, through your living-room window and onto the microphone of your Google Home or Amazon Echo smart speaker.
The speaker says, "OK, opening garage door." The garage door lifts and a gang of thieves enters your house. Or the speaker says, "OK, unlocking and starting car," and a thief climbs into your Tesla and drives away.
Sound like the opening to the next Purge movie? It could really happen, say a team of American and Japanese researchers.
The researchers discovered that precisely modulated lasers could silently send "voice" commands to smart speakers from hundreds of feet away. The attack also worked on smartphones and on an iPad, but only at short distances.
How can you defend yourself if this starts happening in real life? The best bet is to make sure your Amazon Echo, Google Home, Facebook Portal, Amazon Fire TV Cube and other smart speakers aren't facing windows. Putting black tape over the microphone may not work because a high-powered laser beam could shine, or even burn, right through.
The technical details
These attacks work because the microphones in smart speakers and smartphones are tiny micro-electro-mechanical systems (MEMS) chips built from two layers -- a flexible membrane and a stiff backplate -- that together hold an electric charge.
Sound waves cause the membrane to flex and vary its distance from the backplate, and the resulting changes in electric capacitance are registered by the backplate, which converts the changes into an electric signal. The smart speaker or smartphone interprets this signal as sound.
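In other words, the microphone behaves like a variable capacitor. As a rough illustration (using the standard parallel-plate model and made-up dimensions, not figures from the researchers' paper), a tiny deflection of the membrane produces a measurable change in capacitance:

```python
# Illustrative parallel-plate model of a MEMS microphone: C = eps0 * A / d.
# Dimensions below are hypothetical, chosen only to show the effect.
EPS0 = 8.854e-12  # vacuum permittivity, farads per meter

def capacitance(area_m2, gap_m):
    """Capacitance of two parallel plates separated by gap_m."""
    return EPS0 * area_m2 / gap_m

AREA = 0.5e-3 * 0.5e-3   # a 0.5 mm x 0.5 mm membrane
GAP = 4e-6               # 4-micrometer gap at rest

c_rest = capacitance(AREA, GAP)
# A sound wave pushes the membrane, say, 10 nanometers closer to the backplate:
c_deflected = capacitance(AREA, GAP - 10e-9)

# This tiny capacitance change is what the chip converts into an audio signal.
delta = c_deflected - c_rest
```

Anything that makes the capacitance wobble at audio frequencies -- whether real sound or, as the researchers showed, a modulated laser -- looks like a voice to the chip.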
But a laser beam can trigger this process without any sound at all, although it's not yet clear exactly how. It may be that the laser induces the same sorts of changes in electric capacitance on the microphone's backplate that sound would. Or it may be that the laser heats the air around the microphone, moving the membrane.
In any case, the microphone will think there is sound, even if there is none, and send the resulting signal to the device's CPU.
You can do this too for less than $700
The researchers, from the University of Michigan and the University of Electro-Communications in Tokyo, found that a setup involving a laptop, a standard photographer's tripod, a $30 audio amplifier, a $350 laser "driver" and a cheap laser pointer ($15-$20) could be used to modulate the laser beam to mimic actual voice commands. That's enough equipment to send fake voice commands to microphones a few feet away.
Add a telephoto lens -- the researchers used a $200 one -- and you can send that laser beam hundreds of feet and have it still activate smart speakers. Excluding the laptop, which plays recorded voice files to the laser driver, the entire setup costs about $600-$700.
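The trick is amplitude modulation: the laser's brightness is varied in step with a recorded voice waveform. A minimal sketch of that mapping (the function name, bias and depth values are assumptions for illustration, not details from the paper):

```python
import math

SAMPLE_RATE = 44100  # standard audio sample rate

def laser_drive(audio_samples, bias=0.5, depth=0.4):
    """Map audio samples in [-1, 1] to laser drive power in [0, 1].

    A laser can't emit negative light, so the voice waveform rides on
    a constant DC bias; the audio only modulates the brightness around it.
    """
    out = []
    for s in audio_samples:
        power = bias + depth * s
        out.append(min(1.0, max(0.0, power)))  # clamp to the diode's range
    return out

# One millisecond of a 440 Hz test tone, standing in for a voice recording:
tone = [math.sin(2 * math.pi * 440 * n / SAMPLE_RATE) for n in range(44)]
drive = laser_drive(tone)
```

In the real attack, the laser driver performs this role in analog hardware: the laptop plays the voice file into the audio amplifier, which modulates the current feeding the laser diode.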
The researchers climbed to the top of a bell tower on the Michigan campus and were able to control a Google Home speaker on the fourth floor of a building more than 200 feet away.
Right now, most Google Home and Amazon Echo devices are essentially defenseless against this sort of attack. They don't verify the speaker's voice by default -- anyone can give them a voice command just by saying "Alexa" or "OK, Google."
Smartphones and tablets, which unlike smart speakers do tend to leave the house, are a bit better protected. The device owner often has to register his or her voice with the device in order to trigger voice commands.
You can optionally turn on voice-recognition requirements on smart speakers. However, in those cases, only the wake words -- "Alexa," "Hey, Siri," or "OK, Google" -- need to be in the owner's voice. The command that follows the wake words can be in any voice.
The researchers are working with Amazon, Facebook and Google, as well as Tesla, to develop ways to stop these attacks. (Their paper found that Ford vehicles, like Teslas, were vulnerable to this attack through linked smart speakers.)
Possible solutions include having smart speakers ask the user a randomized follow-up question before carrying out a command, or building future smart speakers with more than one microphone and requiring that a command register on all of them.
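The multi-microphone idea works because real sound spreads through the air and reaches every microphone, while a focused laser beam hits only one. A minimal sketch of such a check (the function name and threshold are hypothetical, not from any vendor's implementation):

```python
def heard_on_all_mics(mic_levels, threshold=0.01):
    """Accept a command only if every microphone registers it.

    mic_levels: signal level (e.g. RMS amplitude) measured at each mic.
    Real sound reaches all mics; a laser aimed at one mic leaves the
    others near silence, so the command is rejected.
    """
    return all(level >= threshold for level in mic_levels)

voice_ok = heard_on_all_mics([0.40, 0.35, 0.38])  # real voice -> True
laser_ok = heard_on_all_mics([0.40, 0.00, 0.00])  # laser on one mic -> False
```

A production version would need to account for microphones partially blocked by the speaker's enclosure or room acoustics, but the principle is the same.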
But in the meantime, move those smart speakers away from the window.

Paul Wagenseil is a senior editor at Tom's Guide focused on security and privacy. He has also been a dishwasher, fry cook, long-haul driver, code monkey and video editor. He's been rooting around in the information-security space for more than 15 years at FoxNews.com, SecurityNewsDaily, TechNewsDaily and Tom's Guide, has presented talks at the ShmooCon, DerbyCon and BSides Las Vegas hacker conferences, shown up in random TV news spots and even moderated a panel discussion at the CEDIA home-technology conference. You can follow his rants on Twitter at @snd_wagenseil.
