Criminal justice software code could send you to jail and there’s nothing you can do about it

Trade secrets are trumping personal liberty

DEF CON American police and the judiciary are increasingly relying on software to catch, prosecute and sentence criminal suspects, but the code is untested, unavailable to suspects' defense teams, and in some cases provably biased.

In a presentation at the DEF CON hacking conference in Las Vegas, delegates were given the example of the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) system, which is used by trial judges to help determine sentence lengths and parole terms.

"The company behind COMPAS acknowledges gender is a factor in its decision-making process and that, as men are more likely to be recidivists, so they are less likely to be recommended for probation," explained Jerome Greco, digital forensics staff attorney for the Legal Aid Society.

"Women [are] thus more likely to get probation, and there are higher sentences for men. We don’t know how the data is swaying it or how significant gender is. The company is hiding behind trade secrets legislation to stop the code being checked."

These so-called advanced systems are often trained on biased data sets, Greco said. Facial recognition software, for example, is typically trained on data sets made up predominantly of white men, making it less accurate at matching people of color, according to academic research.

"Take predictive policing software, which is used to make decisions for law enforcement about where to patrol," Greco said. "If you use an algorithm based on data from decades of racist policing you get racist software. Police can say 'It's not my decision, the computer told me to do it,' and racism becomes a self-feeding circle."

It's not just manufacturers who are fighting disclosure around their crime-fighting tools – the police are, too. The use of Stingray devices, which mimic cellphone towers to track and potentially snoop on nearby devices, was kept quiet for years. The New York Police Department used such a device over 1,000 times between 2008 and 2015*, and never mentioned it, Greco said.

While the use of Stingray surveillance technology is now well known, cellular protocols have since been upgraded with stronger security, making it much harder, if not impossible, for the fake cell towers to decrypt and analyze mobile messages and data streams over the air. The cops are also using passcode-cracking tools to get into locked mobile phones – tools that haven't been independently assessed, and can't be, because they are sold only to law enforcement, he claimed.

"Software needs an iterative process of debugging and improvement," said Dr Jeanna Matthews, associate professor and fellow of Data and Society at Clarkson University. "There’s a huge advantage to independent third party testing, and it needs teams incentivised to finding problems, not those with an incentive to say everything's fine." ®

* The statistic is backed by information obtained from the NYPD via a Freedom of Information Law request by the New York Civil Liberties Union.
