Pssst... wanna participate in a Google DeepMind AI pilot? Be careful
Lessons from the NHS's 1.6 million data-records shovel
Imagine you’re in charge of technology and data for part of the UK’s chronically cash-squeezed National Health Service. A world-famous technology firm offers you a cool new service, either free or for very little money. All it wants in return is access to the patient data that will make the service work. What are you going to do?
After the Royal Free NHS Foundation Trust’s reprimand for passing personal data on 1.6 million patients to Google’s UK-based artificial intelligence unit DeepMind, the answer should be: learn from what the north London trust and the company got wrong before you proceed.
Information commissioner Elizabeth Denham did not impose a fine on the Royal Free, which is responsible for its patients’ data, despite her office deciding that the trust failed to comply with the Data Protection Act when testing DeepMind’s Streams app. Dame Fiona Caldicott, the national data guardian for health and social care in England, came to the same conclusion, stating that the Royal Free had not "used an appropriate legal basis for the initial data sharing". The Information Commissioner’s Office can fine organisations up to £500,000, and it penalised private healthcare provider HCA International £200,000 for failing to secure personal data on fertility patients.
But Denham took a nuanced approach, welcoming the fact that the trial produced positive results and that the Royal Free now uses Streams routinely for tracking risks from acute kidney injury. “What stood out to me on looking through the results of the investigation is that the shortcomings we found were avoidable,” she wrote. “The price of innovation didn’t need to be the erosion of legally ensured fundamental privacy rights.”
In its enforcement notice [PDF], the ICO noted that the trust gave DeepMind access to personal data only for clinical safety testing of Streams, as the Royal Free felt this was the sole way to ensure the app was safe. The trust then failed to adequately inform patients, and the ICO questioned whether it needed to pass on around 1.6 million partial records for that testing. The ICO told the trust to commission a range of assessments and audits of its work with DeepMind. The Royal Free says it has already made progress, and is doing much more to inform patients about how their data is used.
All in all, Denham went out of her way not to discourage such work in future, focusing her criticism on specific avoidable problems. This must be a relief to the trusts which are already following Royal Free in adopting Streams, including Taunton and Somerset NHS Foundation Trust, one of the few parts of the NHS to rely heavily on open-source software, whose clinical staff will use the app to access the results of X-rays, scans and blood tests.
DeepMind is working hard to exculpate itself, including by publishing its contracts for use of Streams with Royal Free, Taunton and Somerset and Imperial College Healthcare NHS Trust. “Prior to the outcome of the ICO undertaking, we’d taken steps to become one of the most transparent companies working in NHS IT, appointing a panel of independent reviewers who scrutinise our work, embarking on a patient involvement programme, proactively publishing our NHS contracts, and building tools to enable better audits of how data is used to support care,” says Dr Dominic King, clinical lead for DeepMind Health, in a written response to questions from The Register.
The independent panel has already told DeepMind to do more engagement work with clinicians, academics, healthcare bodies and the public. The company is running consultation events and the deal with Taunton and Somerset includes a commitment to supporting public engagement activity before patient data is transferred.
“We’re committed to doing more to make sure the public are fully aware of exactly what DeepMind Health is working on,” says King. “There’s a fine line between finding exciting new ways to improve care, and moving ahead of patients’ expectations. We are sorry that we fell short when our work in health initially began, and we’ll keep listening and learning about how to get better at this.”
This cuts little ice with privacy campaigners, who don’t feel minded to forgive DeepMind and Royal Free for breaching data protection law. “This was an experiment on human beings,” says Sam Smith, co-ordinator at medConfidential. “The way they did it was so catastrophically stupid.”
Smith says that such work should go through review processes akin to those for drug trials – which would probably have rejected the handover of 1.6 million patient records. “If this was a new drug, the approach Google took was to synthesise a new molecule and inject it into a random bunch of people who walked into Accident and Emergency and see what happens,” he says.
“Medical research on de-personalised data has been critical to health breakthroughs for decades,” responds DeepMind’s King. “This type of research does not ordinarily require individual consent, but it still needs to go through a rigorous medical research approvals process. Where research projects do use identifiable data, which is the case in many drug research trials, then informed consent is usually necessary.”
Campaigners have other lines of criticism. In the publicly released agreements between Royal Free and DeepMind, the costs are blanked out, but a Freedom of Information request by Business Insider revealed that Royal Free, Taunton & Somerset and Imperial are enjoying free use of Streams unless their support costs exceed £15,000 in a single month.
medConfidential’s Smith compares this apparent generosity to a children’s hospital encouraging adults to fundraise for it by skydiving. That might be good for the children’s hospital, but not for the NHS as a whole: a 1998 study covering 174 parachute injuries treated by the NHS, 94 per cent of them sustained by first-time charity parachutists, calculated that for each pound such jumps raised for the health service, it spent £13.75 on treatment.
“Streams is an entirely new clinical app, and we don’t believe it’s right to charge the NHS anything other than modest service fees until it shows sustained impact and value,” DeepMind responds. “Once the sustained benefits of Streams are proven then we’ll aim to charge future partners fees in line with current IT supplier market rates, ideally tying some of these fees to the practical impact we can have on patients, clinicians and the hospitals we serve.”
DeepMind 'should not be shunned because of its owner'
The Royal Free and DeepMind got it badly wrong, but appear to have changed their ways and have certainly opened up about what they are doing. Furthermore, it’s not uncommon for customers that test-drive an early version of a product to get it for free or a reduced price. There is, however, a final criticism: DeepMind’s ownership by Google (or more precisely Alphabet, Google’s parent). Google’s business model involves monetising people’s personal data in return for free services, and it has recently been fined €2.42bn (£2.17bn) by the European Commission for abuse of its market dominance. Allowing a unit of that company access to patient data – particularly when that unit focuses on artificial intelligence, despite Streams not using such technology – adds to anxieties.
medConfidential’s Smith says that DeepMind should not be shunned because of its owner, but adds that Google’s ownership is a factor as NHS trusts are more likely to work with a famous company than a Shoreditch start-up. He adds: “They are held to a higher standard, but they want to be held to a higher standard.”
“We welcome being held to the highest standards,” says King. “By proactively publishing details of our NHS contracts and by appointing an independent panel to oversee and evaluate our work with the health service, we have actively encouraged scrutiny.”
So what can NHS organisations learn from this? “You need full transparency on what kind of information’s made available and whether it is in a fully anonymised form,” says Dr Edward Hockings, director of campaigner EthicsandGenetics.org – although full anonymisation is nearly impossible to achieve, he warns – along with proper public consultations in advance of data-sharing.
One option is not to share actual patient data, the approach taken by Genomics England, the Department of Health-owned organisation which runs the 100,000 Genomes research project. “An individual’s data is not released to researchers,” says chief information officer Peter Counter. “Instead, de-identified data is analysed within a secure, monitored environment – our ‘data embassy’.
"All our participants consent to have their genome sequenced and to have this linked with their medical records, but the identifiers used for linking are not available to researchers. Identifiable data is only used when it is being returned to a person’s own clinician to support their treatment.”
All participants volunteer to take part, can withdraw at any time and are represented on the access review committee which researchers have to get their projects past, he adds: “The overall results of analysis and research work can be exported, but no individual genomic or clinical data can be taken away.”
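Genomics England has not published how its "data embassy" is implemented, but the pattern Counter describes – direct identifiers stripped on the way in, linkage preserved through a stable pseudonym, and re-identification possible only on the controller's clinical path – is a standard pseudonymisation design. The sketch below illustrates it under stated assumptions; every name and field here is illustrative, not drawn from any real NHS system.

```python
import hmac
import hashlib
import secrets

# Secret key held by the data controller, never shared with researchers.
# (Illustrative only; a real deployment would keep this in an HSM or key service.)
PEPPER = secrets.token_bytes(32)

def pseudonymise(nhs_number: str) -> str:
    """Derive a stable research pseudonym from a patient identifier.

    Keyed hashing (HMAC) means researchers holding only pseudonyms cannot
    recover NHS numbers, yet the same patient always maps to the same
    pseudonym, so records can still be linked within the secure environment.
    """
    return hmac.new(PEPPER, nhs_number.encode(), hashlib.sha256).hexdigest()

# Reverse mapping kept on the controller side for the clinical path only.
reverse_index: dict = {}

def ingest(record: dict) -> dict:
    """Strip direct identifiers before a record enters the research store."""
    pseudonym = pseudonymise(record["nhs_number"])
    reverse_index[pseudonym] = record["nhs_number"]
    deidentified = {k: v for k, v in record.items() if k != "nhs_number"}
    deidentified["pseudonym"] = pseudonym
    return deidentified

def reidentify_for_clinician(pseudonym: str) -> str:
    """Only the controller-side clinical path may resolve a pseudonym."""
    return reverse_index[pseudonym]
```

The point of the keyed hash, rather than a plain one, is that an attacker who knows the format of NHS numbers cannot simply hash every candidate and match pseudonyms: without the controller's key, the mapping is opaque.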
Hockings, who has argued that Genomics England doesn’t do enough to anonymise patient data, has recently criticised it for meeting with Google to discuss “using Google’s DeepMind among other subjects” to analyse genomic data.
“The government is misleading the public about data-sharing initiatives,” he says, by promoting commercialisation over privacy. For the NHS tech heads trying to decide whether to participate in such work, avoiding any private-sector involvement in the analysis of patient data looks unrealistic, given that much of the relevant expertise sits in the private sector. What is clear is that transparency, consultation, minimising third-party access to patient data – and not breaking the Data Protection Act – are essential. ®