It doesn't take a security expert to conclude that voice assistants like Siri and Google Now represent possible avenues of attack. A team of French researchers has confirmed that suspicion, successfully sending commands to smartphones, wordlessly, from up to 16 feet (5 meters) away. A clever hacker could exploit this technique to eavesdrop on unsuspecting users, or inundate them with phishing scams.
Wired gave mainstream coverage to the experiment, which first appeared at the Hack in Paris conference in June in a paper entitled "IEMI Threats for Information Security: Remote Command Injection on Modern Smartphones." The researchers were able to silently issue voice commands to nearby phones, provided the devices were attached to headsets with microphones.
The hack works like this: The cord of a headset with a built-in microphone doubles as an antenna. Broadcasting from a radio transmitter within 16 feet of the target device, the researchers induced electrical signals on the headset wire that the phone interpreted as audio coming from the microphone. Those signals could issue commands to the smartphone via Siri or Google Now without the researchers ever uttering a word.
This attack could work, in theory, because many users enable voice commands from their phones' lock screens (or do not use a lock screen at all). In fact, iPhones ship with Siri enabled on the lock screen by default, and Siri does not recognize individual users' voices. With no such safeguard in place, the researchers were able to trick phones into accepting synthesized voice commands to dial certain numbers or navigate to certain websites.
The potential for mischief with such an attack is higher than you might expect. By directing users to malicious websites, it would be possible to infect a phone with invasive malware. Dialing scam numbers or sending texts to expensive SMS services could quickly rack up bogus charges, or a third party could simply make a call and eavesdrop on whatever a victim has to say.
The attack is only a proof-of-concept, and it's unlikely that anyone has attempted it in the wild. Even so, if you want to protect yourself against it, the solution is simple: Enable a lock screen, and disable voice commands until after you enter your PIN or password. In theory, these are things you should be doing anyway in order to keep your phone safe.