Emergent Tech

Is it possible to control Amazon Alexa, Google Now using inaudible commands? Absolutely

Gizmo whisperers reveal their secrets

By Thomas Claburn in San Francisco


Eavesdropping appliances like Amazon Echo and software assistants like Google Now can be attacked using mangled words that devices interpret as commands but humans hear as nonsense.

As explained in a 2015 paper [PDF], the phrase "Cocaine Noodles," for example, can be heard by Google Now as its command invocation, "OK, Google."

But that's the sort of phrase nearby people might notice and wonder about. It's not a particularly subtle mode of attack.

More recently, two Princeton University researchers – assistant professor Prateek Mittal and graduate student Liwei Song – developed a covert way to address attentive gear.

In a paper published on Thursday titled "Inaudible Voice Commands" [PDF], the tech boffins describe a technique for playing commands that people cannot hear but devices can.

By playing back voice commands at a frequency outside of the range of human hearing, the two IoT whisperers managed to direct Google Now, running on a Nexus 5X with Android 7.1.2, to take a picture and to enable Airplane Mode.

They also inaudibly asked Alexa, the Amazon Echo software agent, to add milk to a shopping list and to speak the current weather conditions.

Despite missing the opportunity to secretly order "Cocaine Noodles" from Amazon.com, the pair achieved success rates of 100 per cent at three metres with the Android phone, and 80 per cent at a distance of two metres with the Amazon Echo.

The attack technique takes advantage of the fact that microphones create new frequencies through signal distortion, a consequence of nonlinear audio processing.

"The adversary plays an ultrasound signal with spectrum above 20 kHz, which is inaudible to humans," the paper explains. "Then the victim device's microphone processes this input, but suffers from nonlinearity, causing the introduction of new frequencies in the audible spectrum. With careful design of the original ultrasound, these new audible frequencies recorded by the microphone are interpreted as actionable commands by voice assistant software."
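The demodulation effect the researchers describe can be sketched in a few lines of NumPy. This is a simplified illustration, not the paper's actual signal design: it models the microphone's nonlinearity as a small quadratic term, modulates an audible 1 kHz "voice" tone onto an inaudible 30 kHz carrier, and shows that the quadratic term alone reintroduces energy at 1 kHz. The carrier frequency, tone frequency, and distortion coefficient are all assumed values chosen for clarity.

```python
import numpy as np

fs = 192_000                    # sample rate high enough to represent ultrasound
t = np.arange(0, 0.1, 1 / fs)
f_c, f_m = 30_000.0, 1_000.0    # inaudible carrier and audible "voice" tone (assumed)

# Attacker's signal: the voice tone amplitude-modulated onto the ultrasonic
# carrier. All of its spectral content sits near 30 kHz, above human hearing.
x = (1 + 0.8 * np.cos(2 * np.pi * f_m * t)) * np.cos(2 * np.pi * f_c * t)

# Toy model of microphone nonlinearity: output = input + small quadratic term.
y = x + 0.1 * x ** 2

def tone_amplitude(sig, freq, fs):
    """Peak FFT magnitude near `freq`, normalised by signal length."""
    spec = np.abs(np.fft.rfft(sig)) / len(sig)
    freqs = np.fft.rfftfreq(len(sig), 1 / fs)
    band = (freqs > freq - 50) & (freqs < freq + 50)
    return spec[band].max()

# The transmitted signal has essentially no energy at 1 kHz, but squaring
# demodulates the envelope, so the mic's output does.
print(tone_amplitude(x, f_m, fs))   # ~0: nothing audible was transmitted
print(tone_amplitude(y, f_m, fs))   # clearly nonzero: an audible 1 kHz tone appears
```

Expanding the square of `(1 + m·cos ωₘt)·cos ω꜀t` is what produces the baseband term at ωₘ; everything else lands at ultrasonic frequencies and is filtered away by the microphone's limited bandwidth, leaving a clean audible command for the assistant to hear.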

Device whispering has some limitations. Song, in an email to The Register, said the technique has not been tested on devices trained to recognize specific voices. "However, if we can obtain a recording file of a device owner's voice, we think our attack method still works," he said.

Also, the attack was conducted with a dedicated speaker – not the sort of thing one can sneak into a room easily – and it hasn't been demonstrated using a mobile phone as a sound source. "It is an open question of using phones for attacking, since many phones cannot transmit high-frequency ultrasounds," said Song.

There's a video, for those who go for that sort of thing. ®
