UK.gov's use of black box algorithms to decide stuff needs watching

PS: Don't forget to try to cash in on public data – MPs

Increased use of algorithms in decision-making risks disproportionately affecting certain groups, MPs have said, urging the government to boost transparency and tackle bias - but not forget the value of public data.

In a report on algorithmic decision-making, published today, the House of Commons Science and Technology Committee said the tech has the potential to improve public services and drive wider innovation in a range of sectors, from transport to healthcare.

It said the data held by public authorities could be a potentially lucrative asset, urging the government to ensure it negotiates good deals for its use by private companies.

However, the MPs also sounded a strong note of caution - emphasising that algorithms should not be seen as a magic formula: they can be biased, and they need to operate within a strong regulatory and ethical framework.

The report said:

Algorithms, in looking for and exploiting data patterns, can sometimes produce flawed or biased "decisions" — just as human decision-making is often an inexact endeavour.

As a result, the algorithmic decisions may disproportionately discriminate against certain groups, and are as unacceptable as any existing "human" discrimination.

To tackle this, the MPs called on the government to use its planned Centre for Data Ethics and Innovation - an advisory body that was promised in last year’s budget - to examine biases in algorithms.

For instance, it should look at how to improve the data on which the algorithms are based, and work to ensure that both the data and the teams developing the algorithms are diverse and representative of society.

Another strand of the report considered transparency in algorithms - a much-debated topic that is set to face fresh scrutiny once the European Union’s General Data Protection Regulation comes into force on Friday.

This legislation requires that the use of personal data must be fair - which brings with it requirements on transparency and effects, putting pressure on the developers of algorithms to be able to "explain" the way the complex tech works.

Although there is still some debate over how the provisions will play out in practice, the committee recommended that the ethics centre evaluate the various accountability tools available, such as principles and codes, along with audits of algorithms and certification of algorithm developers.

The report added that, if "disclosure of the inner workings of privately-developed public-service algorithms would present their developers with commercial or personal-data confidentiality issues", the government should establish ways to share the data in a "suitably desensitised format".

One suggested method was through the use of "data trusts" - an idea mooted by computer scientist Wendy Hall and BenevolentTech CEO Jérôme Pesenti in a review into AI commissioned by the government last year.

The idea is that these trusts will allow organisations holding data to share it in a controlled and standardised way with those that want to use it.

However, the MPs stressed that, although there may be practical difficulties in issuing understandable explanations - a point emphasised time and again by the big tech firms during the inquiry - “the government’s default position should be that explanations of the way algorithms work should be published when the algorithms in question affect the rights and liberties of individuals”.

We smell money

At the same time, the committee has obviously cottoned on to the idea that data is an asset, and urges the government to use the data it holds for its own benefit - pointing to the value of the data held by the NHS, which the prime minister also noted earlier this week.

One way of doing this, the MPs said, is for the government to “negotiate for the improved public service delivery it seeks from the arrangements and for transparency, and not simply accept what the developers offer in return for data access”.

They also called for a special procurement model for algorithms that are developed with private sector partners - saying this should be informed by a review commissioned by the Crown Commercial Service (CCS), and done with some urgency, as deals are already being struck.

"The Government must urgently produce a model that demonstrates how public data can be responsibly used by the private sector, to benefit public services such as the NHS," said committee chair, Norman Lamb.

"Only then will we benefit from the enormous value of our health data. Deals are already being struck without the required partnership models we need."

Hetan Shah, chief exec of the Royal Statistical Society – who gave evidence to the committee during the inquiry – welcomed this section in particular, saying the public sector needed to ensure it didn’t repeat the mistakes of the deal between DeepMind and the Royal Free.

The committee also called for the government to produce a public list of where algorithms with "significant impacts" are being used or planned in central government, as a way of encouraging the private sector to get involved and boosting transparency.

In addition, there should be a ministerial champion to provide oversight of algorithms used by the public sector, and to coordinate departments’ approaches to their use and to private sector partnerships.

Elsewhere, the report also gives a nod to the importance of international collaboration, continued funding for the Information Commissioner’s Office, and backs the publication of the data protection impact assessments required under the GDPR. ®
