Let's check in with our friends in England and, oh good, bloke fined after hiding face from police mug-recog cam
Well, it is the nation that brought us Nineteen Eighty-Four
Video: A man was pulled to one side, grilled, and fined by cops after he hid his face from a facial-recognition system being tested on the streets of south east England.
London's Metropolitan Police was at the time running public tests of AI-powered equipment that takes photos of people out and about in the capital, and runs the pics through an image database of Brits on a watch list, looking for a match.
Specifically, the system was being tested in Romford, a town on the outskirts of East London, and not all of its residents were happy about it. A middle-aged bloke wearing a baseball cap pulled up his fleece to hide the bottom half of his face as he walked past the camera to avoid identification.
He was then stopped by officers, who believed he was acting suspiciously, and quizzed. He was fined £90 (~$115) for "disorderly behavior," because, well, it seems the plod couldn't nab him for anything else.
“I said, ‘I don’t want me face showing on anything’. If I want to cover me face, I’ll cover me face. It’s not for them to not tell me to cover me face,” the chap later recalled.
The row was filmed by journalists working for BBC Click, the Beeb's tech TV programme. You can watch the kerfuffle unfolding below, in footage released earlier this week.
"Are you ready for a world of facial recognition? Several UK police forces have been trialling the technology." pic.twitter.com/4LFLLEzSQe — BBC Click (@BBCClick), May 13, 2019
The Metropolitan Police have been trialling facial-recognition systems for a while now. The technology was rolled out to monitor partygoers attending the Notting Hill Carnival in 2016 and 2017, as well as at the Port of Hull docks and Stratford Transport Hub. Last year, it was estimated the technology had a whopping 98 per cent false positive rate. In January this year, it emerged that the Met had blown more than £200,000 on facial-recognition trials with few or no arrests to show for it.
Privacy orgs such as Big Brother Watch, a British nonprofit, have urged the plod to stop using the technology. The Information Commissioner, the UK's data protection watchdog, has launched an investigation into how the police are using face-scanning and biometric systems.
Facial-recognition tech is a contentious issue. Experts have been critical of its inaccuracies and biases. We've asked the Metropolitan Police for comment.
Elsewhere, in San Francisco, politicians have taken the matter into their own hands. It has become the first major city in the US, if not the Western world, to impose stringent rules and regulations on how the technology can be used by cops and city government departments. Private companies and federal departments are not affected, however, so some security cameras and airport scanners can still use facial recognition. ®