This episode of Black Mirror sucks: London cops boast that facial-recog creepycams will be on the streets this year

Here's what Met top brass told the press this morning

In depth London's Metropolitan Police is to start routinely deploying facial-recognition tech across the English capital despite legal and public opinion challenges, the force declared this morning.

The so-called AFR (Automated Facial Recognition) technology, supplied by NEC Global, will be deployed on London's streets to monitor innocent people going about their business – and, so the police hope, to also catch criminals.

"Live Facial Recognition is a fantastic crime-fighting tool," an enthusiastic assistant commissioner Nick Ephgrave told the nation's press (and El Reg) this morning, comparing it to "showing photographs to [constables] before they go out on patrol".

A spokesman from tech supplier NEC told the briefing that its own internal trials showed a 70 per cent accuracy rate, without giving details of how this figure was established. Independent analyses, as well as the Met's own test deployments, have suggested a false-positive rate closer to 98 per cent – not that this is stopping PC Plod.

Ephgrave continued: "An important thing to remember, I think, anyway from my perspective, is live facial recognition makes no decisions. All [it] does is suggest to an officer that a person there might be wanted for GBH or attempted murder or whatever."

A failed legal challenge to South Wales Police's deployment of AFR tech now, ironically, forms a significant part of the legal underpinning for the Met's deployments despite British academics previously questioning both the accuracy and legality of the tech.

Ephgrave said: "We ended up with a very helpful ruling in the Bridges case to give us a strong legal mandate to go ahead with this."

Edward Bridges argued the deployment of AFR in Cardiff during 2017 and 2018 was illegal. The High Court ruled last year that it wasn't. That decision is now being appealed against, with a Court of Appeal hearing scheduled for June.

Responding to this morning's news, civil liberties campaign group Privacy International (PI) told The Register that AFR in London was a "radical and dystopian" idea.

Edin Omanovic, PI's advocacy director, told us: "The Met's ambitious plans are completely out of sync with what we are seeing elsewhere. In the US, cities such as San Francisco have already outlawed the use of facial-recognition technology by police and other government departments, while just last week a European Commission white paper suggested a temporary ban in order to properly assess the human rights impact of this intrusive technology."

The Information Commissioner's Office said in a canned statement: "We expect to receive further information from the Metropolitan Police Service regarding this matter in forthcoming days," and called for the government to issue a statutory code of practice for using AFR.

What's going to happen?

The Met's creepycam setup will consist of two cameras hooked up to a van along with a bunch of coppers wielding devices connected to the NEC image-analysis tech in the van. When the magical black box goes "beep", police will be free to make their own decision to arrest the guilty bastard – er, suspect – er, person of interest.

The van may or may not have police and CCTV markings on it, though Ephgrave was at pains to say that cops will be putting up signs in the general area of a deployment as well as handing out leaflets to tell people the equivalent of: "Smile, you're being scanned by China-style surveillance tech." This is exactly what police were doing when they arrested and fined someone who hid his face from a "trial" deployment of AFR last year.

As for future AFR arrests, in response to robust questioning about whether a match from the AFR system would instantly trigger a handcuffing, Ephgrave said: "The point I want to get across: the decision always sits with the officer."

Commander Mark McEwan added, referring to previous discussions about looking at printed photos of suspects: "Yes, it [an AFR match] does start that journey of building reasonable ground [for stop and search or arrest] in the same way if I had looked on my laptop, on the briefing pages, before I went out on the street."

Each "bespoke" local deployment has its own watchlist, allegedly custom-assembled for each specific occasion. In the past the largest of these lists has held around 2,500 suspects, though NEC's Neoface system can handle lists of up to 10,000 suspects for cross-matching. Each time the creepycams are wheeled out, the watchlists will be based on the Met's own lists of suspects – and not, it appears, any national databases. For now.
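To illustrate how this kind of watchlist cross-matching works in principle – and this is a toy sketch only, not NEC's proprietary Neoface algorithm: the embedding dimensions, cosine-similarity metric, and 0.9 threshold here are all assumptions for demonstration – a face captured by the camera is reduced to a feature vector and compared against every vector on the watchlist, with anything over a threshold triggering the "beep":

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(probe, watchlist, threshold=0.9):
    """Return (name, score) pairs for watchlist entries scoring above
    the threshold -- the 'alert' an officer would then assess."""
    hits = [(name, cosine_similarity(probe, template))
            for name, template in watchlist.items()]
    return sorted([h for h in hits if h[1] >= threshold],
                  key=lambda h: h[1], reverse=True)

# Toy 4-dimensional "face embeddings" -- real systems use hundreds of
# dimensions; a 10,000-entry watchlist is just a bigger dict.
watchlist = {
    "suspect_a": [0.9, 0.1, 0.3, 0.2],
    "suspect_b": [0.1, 0.8, 0.2, 0.7],
}
probe = [0.88, 0.12, 0.31, 0.19]   # camera frame, close to suspect_a
print(match_against_watchlist(probe, watchlist))
```

The threshold is the crux: set it low and innocent passers-by trigger alerts; set it high and wanted faces slip past. The system only ranks similarities – as Ephgrave put it, the decision to act sits with the officer.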

Ephgrave said: "We are going to use [AFR] to focus in on serious violent offenders," as well as crooks who fail to turn up at court and missing children. "The locations we choose to deploy it will be based on intelligence we receive and crime statistics and data."

Privacy, security, accuracy? Trust us, citizen

An NEC spokeswoman claimed, referring to internal studies, that her firm's Neoface product has a 70 per cent success rate, something we have asked for further clarity on. Even a false-positive rate of just one in 100 would, in a city of 8 million people such as London, mean around 80,000 people – almost the full capacity of Wembley Stadium – being at risk of arrest because the black box falsely flagged them as a suspect.
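The Wembley-sized figure is simple base-rate arithmetic: multiply the scanned population by the false-positive rate. A minimal sketch, assuming an illustrative rate of one in 100 – the rate that reproduces the 80,000 figure, since NEC has not published its own:

```python
def expected_false_alerts(population, false_positive_rate):
    """People wrongly flagged if everyone were scanned once."""
    return int(population * false_positive_rate)

# London's ~8 million residents at an assumed 1-in-100 error rate.
print(expected_false_alerts(8_000_000, 0.01))  # -> 80000
```

This is the base-rate problem in a nutshell: when genuine suspects are a tiny fraction of the crowd, even a seemingly small error rate means false alerts vastly outnumber true ones.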

If your mug is not matched by the system, Ephgrave said, the captured images and any associated processing data are deleted immediately.

NEC's spokeswoman was at pains to stress that all data collected is held and processed on the Met's servers and not the private sector's, adding: "If you do generate an alert, those are retained for one calendar month and similar with the alerts that are sent to the mobile device [issued to each policeman on the deployment]. Retained just for the period of the deployment and then that's wiped. The watchlist is wiped from the system immediately after deployment."

Thoughtfully, Ephgrave said he wouldn't want to be part of a police force that stopped doing what it thought was in the best interests of the public just because it lacked support, adding:

Technology moves at a frightening pace. We all know that. But the point is this has to be a balance between our desire to fight crime and the public's desire to have a society where they're content with policing styles and tactics there to protect them.

This puts Met management at loggerheads with the force's trade union chief, who was merrily praising China's Stasi-on-steroids approach to AFR last year. One wonders whether the London Police Federation merely voiced what coppers truly think about modern surveillance technologies. ®
