Peers told to push for cut-price access to med tech developed with NHS data
But for Pete's sake, don’t recommend another new body
The UK's health service should get cheaper rates on healthcare products developed using NHS data, peers have been told.
In the latest instalment of the House of Lords Artificial Intelligence Committee's eponymous inquiry, peers quizzed experts on whether the NHS risked under-valuing - or failing to realise the value of - its data.
The witnesses broadly agreed that, if NHS data had been used successfully to train an algorithm, the health service should benefit in some way, and that this should be in the form of returns rather than an upfront charge.
Martin Severs, medical director of NHS Digital, said, given that AI can't exist without large amounts of data - and in healthcare this would be the public's data - it was right that there should be some sort of return on that investment.
But this needed to be done in a "sensible way", he said: in his opinion, charging for data or trying to enforce intellectual property rights was not the right approach. Instead, the NHS should get reduced rates on the resulting tech.
"If the public saw that their data was used to develop an AI product, and that AI product was discounted to the NHS, I believe… most of society would think that was a fair deal."
Hugh Harvey, a clinical AI researcher at Guy's and St Thomas' Hospital in London, agreed that there shouldn't be a monetary barrier to entry, adding that innovation benefitted from people being able to "fail at low cost".
However, he added that the data could be worth "billions" to the public purse and said it "would be a real shame for the UK economy, especially in times of strain on the funding of the NHS, not to be able to leverage the value of that data".
Their comments echoed those made in an earlier session. Nicola Perrin, who leads the Understanding Patient Data scheme at the Wellcome Trust, said giving the NHS a reduced rate would go some way to building public confidence, but noted that there would be very different implications at the local and national level.
Former Lib Dem MP Julian Huppert, who chairs the independent review panel for DeepMind Health - instigated after the firm's iffy data deal with the Royal Free - agreed the NHS should "absolutely" get some sort of return, but said the main question was in "what format" this should arrive.
For instance, the hospitals working with DeepMind on its Streams app – which is not an AI program, but rather a fixed algorithm – will get five years of free access to the tool, but also benefit from the firm's expertise in data management.
However, Sobia Raza, head of science at the PHG Foundation, pointed out that, although the NHS has the "essential ingredients" - ie, masses of data - it "doesn't have the compute and machine learning expertise" that the companies do.
"Development of algorithms is a collaborative effort," she said, adding that there needed to be incentives, benefits and fairness on both sides.
The witnesses also alluded to the power of the technology monopolies, with Huppert noting that companies that couldn't gain access to data might train their algorithms on different populations and then charge the NHS full price.
But Harvey later argued that, because of differences between populations, if algorithms were to work effectively in the UK, they would need to be trained on NHS data at some point anyway "so we might as well take the lead".
'Don't recommend another Data Body'
Witnesses at both sessions also voiced concerns about the confusing regulatory landscape in the UK, calling for greater clarity rather than more additions to the acronym soup.
Fiona Caldicott, the national data guardian, said that she wouldn't make the case for another system as there were already "a plethora of bodies". Another body for AI might be needed in future, "but in my view, we have as much as is appropriate for the pace at which this is going".
Perrin made this point more strongly, warning that there was a "proliferation" of regulatory bodies and oversight groups, both in existence and in recommendations from various other reviews, which include a data stewardship body, a council on data ethics and a convention on data ethics.
“I’m not entirely clear how all those different proposals fit together, or even if one needs all of them,” she said.
“So if you could give clarity over what’s not needed, rather than suggesting yet another new body, it would be very helpful.”