Is it possible to control Amazon Alexa, Google Now using inaudible commands? Absolutely

Gizmo whisperers reveal their secrets

By Thomas Claburn in San Francisco

Posted in Emergent Tech, 25th August 2017 20:17 GMT

Eavesdropping appliances like Amazon Echo and software assistants like Google Now can be attacked using mangled words that the devices interpret as commands but humans hear only as nonsense.

As explained in a 2015 paper [PDF], the phrase "Cocaine Noodles," for example, can be heard by Google Now as its command invocation, "OK, Google."

But that's the sort of phrase nearby people might notice and wonder about. It's not a particularly subtle mode of attack.

More recently, two Princeton University researchers – assistant professor Prateek Mittal and graduate student Liwei Song – developed a covert way to address attentive gear.

In a paper published on Thursday titled "Inaudible Voice Commands" [PDF], the tech boffins describe a technique for playing commands that people cannot hear but devices can.

By playing back voice commands at a frequency outside of the range of human hearing, the two IoT whisperers managed to direct Google Now, running on a Nexus 5X with Android 7.1.2, to take a picture and to enable Airplane Mode.

They also inaudibly asked Alexa, the Amazon Echo software agent, to add milk to a shopping list and to speak the current weather conditions.

Despite missing the opportunity to secretly order "Cocaine Noodles" from Amazon.com, the pair achieved success rates of 100 per cent at three metres with the Android phone, and 80 per cent at a distance of two metres with the Amazon Echo.

The attack technique takes advantage of the fact that microphones create new frequencies through signal distortion, a consequence of nonlinear audio processing.

"The adversary plays an ultrasound signal with spectrum above 20 kHz, which is inaudible to humans," the paper explains. "Then the victim device's microphone processes this input, but suffers from nonlinearity, causing the introduction of new frequencies in the audible spectrum. With careful design of the original ultrasound, these new audible frequencies recorded by the microphone are interpreted as actionable commands by voice assistant software."

Device whispering has some limitations. Song, in an email to The Register, said the technique has not been tested on devices trained to recognize specific voices. "However, if we can obtain a recording file of a device owner's voice, we think our attack method still works," he said.

Also, the attack was conducted with a dedicated speaker – not the sort of thing one can sneak into a room easily – and it hasn't been demonstrated using a mobile phone as a sound source. "It is an open question of using phones for attacking, since many phones cannot transmit high-frequency ultrasounds," said Song.

There's a video, for those who go for that sort of thing. ®
