
Regulate This! Time to subject algorithms to our laws

A Minority Report future awaits

Opinion Algorithms are almost as pervasive in our lives as cars and the internet. And just as those modes of transport and communication are considered vital to our economy and society, and are therefore regulated, we must ask whether it's time to regulate algorithms too.

Let's accept that the rule of law is meant to provide solid ground upon which our society can function. Some laws stop us taking each other's stuff (property, liberty, lives), while others help us swap our stuff in a way that's fair to the parties involved (property, liberty, time).

The idea of regulating algorithms has gained traction even in Parliament, and not without cause. The reasoning behind such regulation is much in line with other laws: that without oversight and legal culpability, algorithms could be deleterious to the whole business we suffer through of living alongside each other, or of swapping stuff.

In February, Baron Timothy Clement-Jones said artificial intelligence algorithms required "huge consideration" of their "ethics".

Clement-Jones' fellow in the House of Lords, Baroness Byford, told the chamber: "According to a recent radio programme, algorithms are used to make individual decisions in the fields of employment, housing, health, justice, credit and insurance.

"I had heard that employers are increasingly studying social media to find out more about job applicants. I had not realised that an algorithm, programmed by an engineer, can, for example, take the decision to bin an application."

Such concerns are not unique to Parliament. Speaking to The Register in March, UCL's Dr Hannah Fry warned we needed to be wary of algorithms operating behind closed doors.

The issue, she noted, is that without being able to see how such algorithms function, "you can't argue against them" when they produce dodgy results.

"If their assumptions and biases aren't made open to scrutiny then you're putting a system in the hands of a few programmers who have no accountability for the decisions that they're making," Fry said.

She explained how algorithms that predict re-offending rates for individuals are being used in sentencing in the US, where the analysis of such data has very serious consequences.

"An example I use in my talk is of a young man who was convicted of the statutory rape of a young girl – it was a consensual act, but still a statutory crime – and his data was put into this recidivism algorithm and that was used in his sentencing. Because he was so young and it was a sex crime, it judged him to have a higher rate of offending and so he got a custodial sentence," she said.

"But if he had been 36 instead of 19, he would have received a more lenient sentence, though by any reasonable metric, one might expect a 36-year-old to receive a more punitive sentence."

Legislative action has been suggested. Last year, Labour's shadow minister for industrial strategy, Chi Onwurah, told The Guardian that "algorithms aren't above the law" and that "the outcomes of algorithms are regulated – the companies which use them have to meet employment law and competition law. The question is, how do we make that regulation effective when we can't see the algorithm?"

Meanwhile, the European Union's commissioner for competition, Margrethe Vestager, urged competition enforcers to keep an eye out for cartels that use software "to work more effectively" as cartels.

In a speech about algorithms and competition, she stated: "We're not yet dealing with an algorithm quite as smart as [Hitchhiker's Guide to the Galaxy's] Deep Thought. But we do have computers that are more powerful than many of us could have imagined a few years ago. And clever algorithms put that power – quite literally – in our hands."
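What using software "to work more effectively" as a cartel might look like is easy to sketch. The following Python is hypothetical and drawn from no real case: a pricing bot that simply matches the highest rival price.

```python
# Hypothetical pricing bot, sketched to show how cartel-like
# coordination can emerge from trivial logic, with no explicit
# agreement between sellers.

def set_price(our_cost: float, rival_prices: list[float]) -> float:
    """Match the highest price seen among rivals, never selling below cost."""
    return max(our_cost, max(rival_prices))

# If every seller in a market runs logic like this, nobody ever
# undercuts anybody: prices ratchet up to the highest offer and stay there.
print(set_price(our_cost=5.00, rival_prices=[9.99, 10.49, 10.25]))  # 10.49
```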

So what is regulation, and how do we do it?

The immediate answer to many of these concerns is to reveal biases in algorithms by opening them up to public scrutiny. This has been the most fundamental of all human political activities since the Enlightenment — to observe and to measure the expression of power in society.

And yet, if the last decades of open-source software have taught us anything, it is that mere availability does not incentivise investigation. Very old vulnerabilities are regularly found in software that has been in use more than long enough for them to have been discovered sooner: Shellshock, for example, sat in Bash for some 25 years before it was identified in 2014.

The House of Lords debate earlier this year centred on a proposed amendment to the Digital Economy Bill, which would have given Ofcom the power to "carry out and publish evaluations of algorithms". But unlike data protection, where strict definitions allow the Information Commissioner's Office to enforce the Data Protection Act, algorithms rarely come with specifically defined aims and intentions against which their performance could be measured.

The increasing popularity of machine learning algorithms will only make this problem more apparent. When an organisation doesn't know exactly what it wants from an algorithm, how can it measure what it is getting? And how will unintended results be noticed and reported to the regulator?

One such method could be to require organisations using algorithms to retain records of all the data they use, and to reappraise previous findings whenever the algorithm is updated. This would be expensive in itself, and the results of reappraisal could be extreme.
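As a rough illustration of what such record-keeping might involve, consider the following sketch. Everything in it, from the log format to the field names, is hypothetical; a real regulatory regime would be vastly more involved.

```python
# A minimal sketch of the record-keeping regime suggested above.
# All names and structures here are hypothetical.
import json
import time

AUDIT_LOG = "decisions.jsonl"

def record_decision(model_version: str, inputs: dict, outcome: str) -> None:
    """Append each decision, its inputs, and the model version to an audit log."""
    entry = {"ts": time.time(), "model": model_version,
             "inputs": inputs, "outcome": outcome}
    with open(AUDIT_LOG, "a") as f:
        f.write(json.dumps(entry) + "\n")

def reappraise(decide, new_version: str) -> None:
    """Re-run every logged decision under an updated model and flag any change."""
    with open(AUDIT_LOG) as f:
        for line in f:
            entry = json.loads(line)
            new_outcome = decide(entry["inputs"])
            if new_outcome != entry["outcome"]:
                # A changed outcome is exactly the "extreme" result warned
                # of above: a sentence or a mortgage refusal that rested
                # on a superseded model.
                print(f"Changed under {new_version}: "
                      f"{entry['outcome']} -> {new_outcome}")
```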

Who would be liable for a man wrongfully given a harsh prison sentence, or a family denied a mortgage on a house they could in fact have afforded?

Your suggestions are welcome. ®
