Voice assistants are always listening. So why won't they call police if they hear a crime?

We've given away our privacy for the wrong rewards

If you saw someone being assaulted, you'd probably whip out your phone and dial for help.

But when one of our newly ubiquitous devices hears a crime, it does nothing. If Alexa or Google Assistant or Siri hears an assault, or a rape, it sits there waiting for its cue to act.

But that's a load of malarkey. Listening for a cue means the devices are already listening to everything.

How closely these devices listen is a design choice: manufacturers want them to do less work, consume fewer computing resources, preserve bandwidth and battery life.

Are those choices good enough when the chatter these devices hear is no longer idle? When people are screaming for help or crying in fear, should our devices simply ignore that because it doesn't match a particular pattern or make for a good user experience?

Let's be honest: these things are designed to spy on us. Saying that they'll spy on our wants - that's what they're really all about - without responding to our actual needs for safety and security seems like an intentional set of bad design choices.

These devices, or the cloud services that power them, can easily understand when someone is angry, terrified or in pain. It should be almost trivial to detect when something is way out of range, and flag it.

But this has never been about detection: this is all about responsibility. When the facade drops and these devices are seen as the potential lifesavers they could be for people in trouble, we will demand the makers wire them into the emergency services.

That won't be hard to achieve, but the decision to do so will change the character of these devices: from something that's all about want and desire into something that's also about need, safety, security and being watched over - all the time.

It's a clever little trick the makers of these devices have pulled - making us believe they can do one thing and not another very obvious thing. The marketing for all of these devices has framed them as the fulfilment of desire, just by using our voices. The reality of surveillance - and the "burden of omniscience" that comes with it - well, that's not as appetizing. It's not something any vendor wants to own.

But do they have a choice? If someone is attacked in their home, and Alexa or Siri hears it but does nothing, will Amazon's or Apple's hands be clean? That's a question that has already gone before the courts. Now that tens of millions of these connected speakers crowd our homes, such situations will occur with ever-greater frequency.

Listening means being responsible for whatever you hear.

We're walking a fine line here between devices that act like psychopaths - who don't care and won't respond when someone is being assaulted - and a world where all of these listening devices make us feel profoundly unfree. There's no easy way through this, but that doesn't mean we can just toss this mess into the 'too hard' bin. We're listening as never before, and we have to do something about it. ®

Biting the hand that feeds IT © 1998–2018