Google to remove private medical data from search results

As DeepMind slurps up more patient data for Streams

Google has decided to wipe people’s medical records from its search results - just as its AI branch DeepMind extends its grip on UK patients’ medical records.

The Chocolate Factory added an extra line to its removal policies page on June 23, which lists “confidential, personal medical records of private people” as information that Google might remove.

This is in addition to bank account numbers, credit card numbers, images of signatures and national identification numbers - as well as the more recently added policy to scrap “revenge porn” snaps if the people in the pictures request it.

The move comes as the company's AI arm, DeepMind, has announced that another hospital is giving it access to NHS data.

Taunton and Somerset NHS Foundation Trust has given the thumbs up to DeepMind, and will roll out the company’s Streams app - which aims to identify patients at risk of acute kidney injury - over the next five years at its Musgrove Park Hospital.

The hospital provides care to a population of 340,000, as well as some specialist services to the whole of Somerset, which gives it a catchment population of around 544,000.

The app isn’t currently an AI technology - instead it uses an existing algorithm that doctors and nurses already use to establish whether someone is at risk of kidney injury - but the company has been clear that it hopes to develop AI-powered algorithms for healthcare.

However, DeepMind’s somewhat gung-ho approach to patient data security has repeatedly come under fire.

DeepMind's initial partnership with the Royal Free Hospital in London was put on hold for a few months during 2016 after it was revealed that not all of the Free's 1.6 million patients were aware their data was being used.

Earlier this year, it was revealed that the UK’s national data guardian for health, Fiona Caldicott, had deemed the original use of those medical records legally “inappropriate”.

In a leaked letter to the medical director of the Royal Free, Caldicott said that the firm and the trust had said the basis for data sharing was implied consent, but that since the data was being used to test and develop the app this could not apply.

Caldicott said that patients would not have reasonably expected their records to be used for app testing, and that implied consent “is only an appropriate legal basis for the disclosure of identifiable data for the purposes of direct care if it aligns with people’s reasonable expectations”.

DeepMind has consistently said that patients’ data is safe, and will not be shared with its parent company, as well as emphasising the benefits Streams has for patients.

Dominic King, clinical lead at DeepMind Health, said: “Nurses and doctors already using Streams are telling us that it is helping them deliver faster and better care for their patients.”

Musgrove Park Hospital - the first outside London to agree to trial the Streams app - said that no data had yet been transferred to Streams, and that there would be a series of workshops and open days to explain the trust’s decision and show how the app works. ®
