
We listened to more than 3 hours of US Congress testimony on facial recognition so you didn't have to go through it

Long story short: Models are ineffective, racist, dumb...

Analysis AI experts, lawyers, and law enforcement urged US Congress to regulate the use of facial recognition technology during a hearing held by the House Committee on Oversight and Reform on Wednesday.

The technical issues and social impacts of using AI software to analyse images or videos are well known. There have been repeated reports of inaccuracies leading to people being misidentified, both in research and in real life. San Francisco just passed an ordinance banning the local government from using facial recognition technology.

In some cases the effects have been minimal, as in the experiment by the American Civil Liberties Union (ACLU), a nonprofit based in New York, which showed Amazon’s Rekognition incorrectly matching members of the US Congress to criminal mugshots. It’s simply absurd for elected politicians to be wanted criminals. But what happens when the technology is turned on civilians who have less power?

During Wednesday's hearing, Joy Buolamwini, founder of the Algorithmic Justice League, an activist collective focused on highlighting the shortcomings of facial recognition, told the committee her research found that commercial models struggled most when it came to recognizing women with darker skin. IBM’s system was incorrect 34.7 per cent of the time when identifying black women, she said.

Below is a video of the hearing, or read on for a summary...

[Youtube video of the hearing]

The problem boiled down to biased training datasets, Buolamwini told the House committee. AI systems perform worse on data they haven't seen much of before, so if most datasets mainly represent white men, it's not surprising that the resulting models struggle when faced with images of women of colour.
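To make that concrete, here is a minimal, hypothetical sketch (not any vendor's actual pipeline, and using made-up data) of the kind of disaggregated evaluation Buolamwini advocates: rather than reporting a single headline accuracy figure, error rates are broken out per demographic group, which is exactly where a skewed training set shows up.

```python
# Toy illustration only: made-up predictions, not data from any real system.
# The point is that a single aggregate accuracy can hide large per-group gaps.
from collections import defaultdict

# Each record: (true identity, predicted identity, demographic group).
results = [
    ("alice", "alice", "darker-skinned women"),
    ("bea",   "carol", "darker-skinned women"),
    ("carol", "alice", "darker-skinned women"),
    ("dan",   "dan",   "lighter-skinned men"),
    ("ed",    "ed",    "lighter-skinned men"),
    ("frank", "frank", "lighter-skinned men"),
    ("gary",  "gary",  "lighter-skinned men"),
    ("hal",   "hal",   "lighter-skinned men"),
    ("ian",   "ian",   "lighter-skinned men"),
]

totals, correct = defaultdict(int), defaultdict(int)
for true_id, predicted_id, group in results:
    totals[group] += 1
    correct[group] += (predicted_id == true_id)

overall = sum(correct.values()) / len(results)
print(f"overall accuracy: {overall:.0%}")  # 78% -- sounds tolerable
for group, n in totals.items():
    print(f"{group}: {correct[group] / n:.0%}")  # 33% vs 100% -- the real story
```

Because the majority group dominates the sample, its near-perfect score props up the overall number; the same dynamic underlies the "pale male" benchmark datasets Buolamwini criticised later in the hearing.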

When it comes to databases of mugshots, however, the reverse is true: Black people are overrepresented, explained Clare Garvie, senior associate at Georgetown University Law Center's Center on Privacy & Technology. If law enforcement uses these flawed models against the very group they struggle most to identify, it will inevitably lead to police stopping and searching the wrong people. “It’s a violation of the First and Fourth Amendment,” Garvie said during the hearing.

Law enforcement and lack of transparency

Cedric Alexander, the former president of the National Organization of Black Law Enforcement Executives who was also a witness at the hearing, estimated that at least a quarter of law enforcement agencies across the US use facial recognition to some degree.

Police in Washington County and Orlando are among the agencies that have used Rekognition. Michael Punke, Amazon’s VP of Global Public Policy, said at the time that the company had “not received a single report of misuse by law enforcement.” That claim is difficult to verify, however, considering that police haven’t been transparent about how the technology is used.

It’s all done in secrecy, according to testimony. Elijah Cummings, the chair of the Oversight Committee, said that 18 states had shared data such as passport photos or driver's licenses with the FBI without explicit consent. When the witnesses were pressed on what kind of information law enforcement agencies share with one another, nobody knew.

Neema Guliani, senior legislative counsel for the ACLU, took a tough stance and called for a moratorium on the technology. She urged the committee to “take steps to halt the use of face recognition for law enforcement and immigration enforcement purposes until Congress passes a law dictating what, if any, uses are permissible and ensures that individuals’ rights can be protected.”

Unregulated use of the technology could also lead to an “Orwellian surveillance state,” where citizens are constantly tracked, Guliani said. In his opening statement, Cummings said there are about 50 million surveillance cameras in the US, and that half of all American adults are probably in facial recognition databases without even knowing it.

Andrew Ferguson, professor of law at the University of the District of Columbia, agreed that Congress needed to act now and prohibit facial recognition until clear rules are established. “Unregulated facial recognition should not be allowed to continue unregulated. It is too chilling, too powerful. The Fourth Amendment won’t save us. The Supreme Court is trying to make amendments but it’s not fast enough. Only legislation can react in real time to real time threats,” he warned.

Alexander was more cautious about a blanket ban on the technology, however, believing there were still ways law enforcement could put facial recognition to good use.

“There is a place for the technology, but the police need to be trained properly. They can’t just be passed the technology by software companies.” Effective policing is about building relationships with the local community, and it cannot afford the fallout from misidentifying people. How, he asked, can we utilise the technology whilst developing some standards?

Benchmark tests simply aren't good enough

The National Institute of Standards and Technology (NIST), a laboratory within the US Department of Commerce, currently conducts official benchmark tests for commercial facial recognition systems. But those tests need to be better, Buolamwini said, raising the issue of what she called “pale male datasets”.

“The gold standard benchmark dataset is biased and can lead to a false understanding of progress,” she said.

Even if a facial recognition system achieved near-perfect accuracy in the testing phase, that wouldn’t solve the problem that the imagery used by law enforcement is often grainy and low resolution. A recent report by Georgetown University found that in some cases police were even trying to match people against composite artist sketches.

“Faces may be the final frontier of privacy,” Buolamwini said.

The hearing took place at the same time as Amazon shareholders tried to stop Rekognition being sold to law enforcement. The proposal was defeated, but the vote tallies were not immediately disclosed. ®
