Algorithms, Henry VIII powers, dodgy 1-man-firms: Reg strokes claw over Data Protection Bill

Handy guide to best and worst amendments tabled for new law

The House of Lords will today start poring over the UK’s Data Protection Bill, line by line, as it enters committee stage.

The peers have to agree to every one of the 194 clauses in the bill and debate 32 pages' worth of amendments, so it's no surprise this stage can often take more than seven days to complete.

The bill, introduced last month, aims to put the EU General Data Protection Regulation into UK legislation.

And, although some parts of the bill should be a straightforward lift-and-shift of the rules, there are additional sections for the exemptions member states are allowed to define themselves, while other sections focus on national security and intelligence.

The result is a complex, often confusing, piece of legislation that Baroness Lane-Fox of Soho described as being "incredibly hard to read and even harder to understand".

When it was introduced, the focus was on the creation of new criminal offences in the UK, including for re-identification of de-identified personal data.

Aside from the fact that the language used doesn't map neatly onto the more commonplace terms pseudonymised and anonymised data, this section raised eyebrows because it failed to offer an explicit exemption for security researchers.

Alan Woodward, a security researcher at the University of Surrey, told The Reg at the time that this could catch people out, and likened it to laws that make reverse-engineering of software products illegal.
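To see why researchers are nervous, here is a minimal, purely hypothetical sketch in Python – the record, identifier format and hashing scheme are invented for illustration and aren't drawn from the bill or any real dataset. If "de-identification" just means swapping an identifier for an unsalted hash, the original value can be recovered by brute-forcing the candidate space, which is exactly the kind of weakness researchers set out to demonstrate.

# Hypothetical illustration only: a record "de-identified" by replacing its
# identifier with an unsalted SHA-256 hash, then re-identified by simply
# enumerating a small candidate space of possible identifiers.
import hashlib
import string
from typing import Optional

def pseudonymise(identifier: str) -> str:
    """Replace an identifier with its unsalted SHA-256 hash."""
    return hashlib.sha256(identifier.encode()).hexdigest()

# A "de-identified" record as it might appear in a released dataset.
record = {"id_hash": pseudonymise("AB1234C"), "postcode": "GU2 7XH"}

def reidentify(target_hash: str) -> Optional[str]:
    # Brute-force every candidate of the form "AB" + four digits + one letter.
    for number in range(10_000):
        for suffix in string.ascii_uppercase:
            candidate = f"AB{number:04d}{suffix}"
            if pseudonymise(candidate) == target_hash:
                return candidate
    return None

print(reidentify(record["id_hash"]))  # recovers "AB1234C" in well under a second

Real datasets are larger and real identifiers longer, but the principle – and the ambiguity over whether demonstrating it would now be an offence – is the same.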

Calls for an amendment to explicitly exempt them have not been granted, but two peers – Lord McNally and Lord Clement-Jones – have said they intend to oppose that clause remaining in the bill, so it should at least get the discussion going.

However, since its introduction, a more significant concern has been spotted in the bill text, prompting an outcry from privacy and civil liberty campaigners, including MedConfidential.

This is a broad exemption that would remove a person's rights as a data subject – their ability to access information or ask how it is being used – if meeting those requests would prejudice "effective immigration control", a phrase the bill does not define.

This is concerning because it isn't clear what the government wants to use the exemption for. Nothing comparable appears in either of the two preceding Data Protection Acts – which have been on the statute book for the past 35 years – and there are already exemptions covering crime, national security, public safety and protection of sources in both the GDPR and elsewhere in domestic law.

As Liberty said, the clause could "strip migrants of the right to have their personal information processed lawfully, fairly and transparently when it is being processed for immigration control purposes, regardless of their immigration status".

Data protection expert Chris Pounder noted that the clause could prevent asylum seekers from obtaining the information they need to appeal a Home Office decision on whether they have the right to remain – 13 per cent of such appeals are successful, he said in a blogpost.

Given the existing exemptions available to the government, he said there is a "distinct possibility" that the powers granted here could "become an administrative device to disadvantage data subjects using the immigration appeals process".

Four peers have lodged an amendment that would scrap this paragraph.

Statutory what now?

Meanwhile, the government is once again being accused of giving itself too much future power, via so-called Henry VIII clauses and other delegated powers that effectively allow it to amend the primary legislation without asking Parliament.

They're an increasingly common feature of legislation as the government tries to future-proof laws while grappling with Brexit, and this bill is littered with them – for instance, powers to add new bases for processing sensitive personal data and to create exemptions for classes of data controllers.

The House of Lords Constitution Committee's report on the bill pulls the government up on this, saying: "We draw attention to the number and breadth of the delegated powers in this Bill. This is an increasingly common feature of legislation, which, as we have repeatedly stated, causes considerable concern."

Small companies can handle lots of data

When it comes to the amendments, the most damaging is broadly agreed to be the one that says "this Act does not apply to any organisation employing five employees or fewer", tabled by Lord Arbuthnot and Baroness Neville-Rolfe – a former digital minister who should almost certainly know better.

"If the intention of the amendment about small business really is to exclude organisations with less than five employees from data protection altogether, I think it is the single most harmful thing a person could suggest. I’m bloody furious about it," said data protection consultant Tim Turner.

"A company with two or three employees can process data about millions of people. There are many examples of this - The Consulting Association was an organisation that ran harmful blacklists about construction workers for years. It had one employee."

Jon Baines, chair of NADPO - the National Association of Data Protection and FOI Officers - added that such an exclusion would be "open to abuse by miscreant companies that structure themselves so as to avoid being subject to the Act".

There’s also an amendment that would change the status of colleges, schools and universities so they wouldn’t be classed as public authorities. The effect? They wouldn’t be required to appoint a data protection officer and would probably be able to process data based on their legitimate interests, which they can’t do at the moment.

Baines said he didn’t "see the case" for such an exemption, but Turner said it could help universities carry out some of their core functions.

"The fact that public authorities can’t use legitimate interests is a real problem for things like fundraising and alumni work," Turner said. But, he added it would be "foolish" to exempt them from data protection officers as they process data from large numbers of people for a huge range of purposes.

Overall, he said: "It’s a blunt instrument, but it is in response to a genuine problem. I think with some extra detail, it would be reasonable to explore these implications. I’d certainly support universities not being public authorities but still being required to appoint DPOs."

But some amendments have garnered more support. Three peers tabled one that would adopt into the Data Protection Bill an article of the GDPR – one that member states are not required to transpose into domestic law – which allows certain not-for-profit bodies to complain to a regulator without an individual instructing them to.

"This would improve consumer rights in two ways," the Open Rights Group said. "Firstly, it will protect the most vulnerable members of society such as children and the elderly. Secondly, it will move data protection to the same status as other consumer rights frameworks like competition or finance."

Computer says no?

The Liberal Democrats, meanwhile, are making an effort to tighten up rules on automated decision-making and the right for people to understand how machines are affecting their lives.

This includes a new clause that would grant someone the right to information about individual decisions made by public bodies based on algorithmic profiling.

Moreover, this would extend to both public authorities and contractors performing government functions - this last is one to watch, given that the Information Commissioner’s Office has been pushing for contractors to be subject to Freedom of Information laws.

There are also crucial efforts to make this a right people can actually make use of, as existing rules in this vein are rarely used and have come under fire from academics.

For instance, at the moment, someone would have to demonstrate that there was a "significant effect" on them as an individual - tough for situations like racist advertising that is more likely to damage a group than a single person. An amendment aims to add a line to say information should be handed over for "a group sharing a protected characteristic…to which the data subject belongs".

There is also an amendment that says that a decision is "based solely on automated processing" if there is "no meaningful input by a natural person" - a definition that has been lacking in previous rules.

Many of these amendments are based on research carried out by Lilian Edwards and Michael Veale of University College London.

"High-stakes public sector decisions really changes people's lives, and we have to make sure they are well investigated," Veale told The Register.

"Explanations can be useful, but no one has the energy to investigate every aspect of their life — we've seen online that over-burdening the user with choices is often overwhelming and unhelpful." ®
