Google brings its secret health data stockpiling systems to the US

Remember the UK DeepMind scandal? No?

Updated Google is at it again: storing and analyzing the health data of millions of patients without seeking their consent - and claiming it doesn’t need their consent either.

Following a controversial data-sharing project within the National Health Service (NHS) in the UK, the search engine giant has partnered with the second-largest health system in the United States, St Louis-based Ascension, to collect and analyze the health records of millions of patients.

According to a report in the Wall Street Journal, which claims to have seen confidential internal documents confirming the move, Google already holds the personal health information of millions of Americans across 21 states in a database. The project is codenamed Project Nightingale and, according to the WSJ, over 150 Google employees have access to the records of tens of millions of patients.

Neither patients nor doctors have been told about the project, and neither has consented to Google being given access to the health data. But Google is relying on a legal justification: under the Health Insurance Portability and Accountability Act of 1996, hospitals are allowed to share data without telling patients if that data is used “only to help the covered entity carry out its health care functions.”

Google is using the data - which covers everything from lab results to doctor diagnoses and hospitalization records, and is linked to patient names and dates of birth - to develop new software that purports to use artificial intelligence and machine learning to provide valuable insights into health issues, and even to predict future health problems for individuals.

The whole approach may seem oddly familiar to Reg readers: we have extensively covered an almost identical scheme in the UK, run by Google's DeepMind, in which Google was found to be storing and analyzing data on over a million patients following a data-sharing agreement with the Royal Free Hospital.

Not this again

Neither the hospital nor Google sought or received permission from doctors or patients for the use of that personal data, sparking an investigation by the Information Commissioner’s Office (ICO) that found a host of problems with the scheme.

The Royal Free NHS Foundation Trust had failed to comply with the UK's Data Protection Act when it handed the details of 1.6 million patients to Google's DeepMind, the ICO concluded. It also found several shortcomings in how the data was handled, including that patients were not adequately informed that their data would be used as part of the test.

The hospital was told to establish a proper legal basis under the Data Protection Act for the project and for any future trials, and outline how it will comply with its duty of confidence to patients in any future trial involving personal data. It was also told to complete a privacy impact assessment and commission an audit of the trial.

That subsequent audit itself proved controversial when it argued that the sharing of personal health data without consent had not broken any laws - despite the ICO and the National Data Guardian at the UK's Department of Health concluding otherwise.

The report - commissioned by the trust - was limited in scope: it did not dig into Google's initial data gathering, only into the ongoing use of the “Streams” app that Google was developing. Most significantly, it concluded that the hospital had not breached its “duty of confidence” and justified that decision by claiming that the correct law to apply to the project was not data protection law but confidence law.

Under that law, the report argued, the data sharing was legally justified if its use did not “trouble a health professional's conscience.” In other words, the legality of gathering and analyzing personal health data went from objective - you cannot do this without consent - to subjective - does this trouble my conscience?

Strained consciences

Cash-strapped hospitals’ consciences are likely to be more flexible when approached by a company that turns over $137bn in annual revenues and $31bn in net income (PDF) and is determined to use its systems to break into the health market, as indicated earlier this month by its $2.1bn acquisition of wearables company Fitbit.

The DeepMind audit also contained a number of other questionable assumptions. The auditors accepted Google’s argument that it needed to use very large databases of real patient data for safety reasons, although they didn’t dig into the basis for that claim. Nor did they ask why Google needed to store that data itself, or why Google needed to retain data indefinitely - in this case, going back eight years - as opposed to, say, a 12-month cut-off and deletion of older data.

Google did not even have a formal deletion policy. On that point, the auditors simply referred to clinicians who said the historical data was useful for context. The audit also repeatedly argued that because the hospital’s systems already held data going back many years, the data given to and stored by Google was mere duplication.

That argument was attacked by critics, who pointed out that a hospital exists to provide care to patients and is paid to do that job, whereas Google’s entire business model is based on compiling data on people and then monetizing it by charging advertisers for access to people who may be interested in their products.

"It's clinical care through a mass surveillance lens," noted Eerke Boiten, professor of cybersecurity at De Montfort University. But the hospital’s auditors didn’t think that Google’s business model was relevant.

"In conducting our review, we considered if we ought to treat DeepMind differently from the Royal Free's other information technology partners, such as Cerner," the report said. "We decided that this would not be appropriate. DeepMind acts only as the Royal Free's data processor... Given this limited mandate, we do not see why the Royal Free's engagement with DeepMind should be any different from its use of other technology partners."

Computer says whoah

There was also a technical assumption within the audit that raised eyebrows: it claimed that it was essential for Google to store all the data itself because the hospital’s IT systems wouldn’t be able to handle the load of Google’s database queries.

According to the audit, “the technical barriers to move to a query-based model are insurmountable” - yet there appears to have been no inquiry into the actual systems in place at the hospital; the auditors simply took that claim at face value.
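
For the unfamiliar: in a query-based model the hospital retains the records, and the processor asks for only the data it needs, when it needs it, rather than taking away a wholesale copy. Below is a minimal Python sketch of the distinction - every name and endpoint is hypothetical, and it describes neither party's actual systems - purely to illustrate what the audit waved away.

```python
# A purely illustrative sketch of the two architectures discussed above.
# Every name and endpoint here is hypothetical - this is not any real
# hospital system or Google API.
import requests  # generic HTTP client, standing in for whatever transport is used

HOSPITAL_API = "https://records.example-hospital.invalid"  # hypothetical endpoint

local_store: dict[str, dict] = {}  # the processor's own copy of the data

def bulk_copy_model() -> None:
    """The model the Royal Free deal used: the hospital exports its full
    historical dataset and the processor keeps its own copy indefinitely."""
    all_records = requests.get(f"{HOSPITAL_API}/export/all").json()
    for record in all_records:
        local_store[record["patient_id"]] = record  # copy now held by the processor

def query_based_model(patient_id: str) -> dict:
    """The alternative the audit called insurmountable: the hospital retains
    the data, and the processor fetches a single record on demand, uses it,
    and stores nothing - the cost being a query hitting the hospital's
    systems every time, which is the load the audit said they couldn't bear."""
    resp = requests.get(f"{HOSPITAL_API}/patients/{patient_id}/record")
    resp.raise_for_status()
    return resp.json()  # processed transiently, never written to local_store
```

The difference matters because in the second model the processor never accumulates a database of its own - the very thing the audit concluded Google had to be allowed to do.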

Some other details about Google and DeepMind: Google initially said that DeepMind operated independently, and so the data was never going to make its way into Google's larger databases. But after being cleared by the audit, Google took over DeepMind entirely, subsuming it into its corporate umbrella - pulling it into its Google Health US arm, the same arm that has the data-sharing deal with Ascension exposed today.

And who could forget that when Google absorbed DeepMind Health, it disbanded the unit's "independent review panel."

With Google entering the larger US market and with access to tens of millions of patients’ records, the tech giant has decided that rather than independent boards, it will hire staff and give them the same oversight role.

We're hiring!

Last month, it hired Karen DeSalvo in the new role of chief health officer. DeSalvo was previously national coordinator for health IT under US president Barack Obama.

A few months earlier, Google hired former FDA Commissioner Robert Califf to look after policy and health strategies. And both of them will report to former hospital executive David Feinberg.

In September, Google signed a 10-year deal with another US health provider, the Mayo Clinic, to store its genetic, medical, and financial records. That deal purposefully left the door open to Google developing its own software as a result of the data access, but Mayo said any personally identifiable data would be removed before it was shared.

This latest project - Project Nightingale - does not appear to have the same privacy-protecting constraints.

In response to our questions, Google directed us to a press release put out today by Ascension. Nothing in the press release undercuts the WSJ report that 150 Google employees have access to the personal health records of tens of millions of Americans, nor does it address the issue of consent, or the claim that the data is not anonymized.

Instead, it refers to the project as a “collaboration” and says the deal will “modernize” its systems by “transitioning to the secure, reliable and intelligent Google Cloud Platform.” It also says that the collaboration will be “exploring artificial intelligence/machine learning applications that will have the potential to support improvements in clinical quality and effectiveness, patient safety, and advocacy on behalf of vulnerable populations, as well as increase consumer and provider satisfaction.” ®

Updated to add

Google has now released a statement on the matter, stressing the main focus is on providing cloud services to Ascension, describing it as "a business arrangement to help a provider with the latest technology, similar to the work we do with dozens of other healthcare providers."
