Can you trust an AI data trust not to slurp your data?
Data ownership is so yesterday. Give us all you've got... ethically
Comment In a refreshing change, the British government yesterday appointed an NHS technology advisory panel with almost no medics or tech experts on board. Today, it announced the names of expert advisors to the nebulous "Centre for Data Ethics and Innovation", one of two new bodies set up this year. This one is intended to assure the public that they can trust AI companies to slurp their data.
The centre was one recommendation from a study which formed the basis of the "AI Sector Deal". The centre's job is to devise another recommendation: "Data Trusts", or in the report's own words:
Mechanisms where parties have defined rights and responsibilities with respect to shared data – in order to protect sensitive data, facilitate access to data, and ensure accountability. This will allow and ensure fair and equitable data sharing between organisations in the private sector, and between the private and public sectors.
The public must be able to trust the flow of data from individuals to the crunchers, the Department for Digital, Culture, Media & Sport said today. We'll return to that relationship shortly. First, let's turn to the Ministry of Fun to see who's on the board of this one.
Guardians of the hen house
Chairing the AI data advisory council is Roger Taylor, co-founder of health data processing company Dr Foster (Taylor also sits on the NHS panel announced yesterday). Also serving is Susan Liautaud, a corporate advisor who is vice chair of the LSE's governing body, teaches at Stanford, and already advises the government on how to appoint business advisors.
Oxford philosopher Professor Luciano Floridi, whose work is funded by Google and Microsoft, is another appointee. In 2014, Floridi was a member of Google's Right To Be Forgotten roadshow, where he called for the law to be reviewed.
We also find Richard Sargeant, the civil servant who contributed to the Google-friendly Gowers Review of intellectual property in 2006 before joining Google's public policy team. That was the review driven by an assertion, apparently made by Google and echoed by the then Prime Minister David Cameron, that the founders of Google believed they could "never have started their company in Britain" - a claim nobody could ever source. Sargeant later returned to government in a senior position at GDS.
Ex-super-spad* Dan Korski, former deputy director of policy at No.10, is also on board. During the Cameron years, Korski championed Tech City and digital "disruptors", going in to bat for Uber against TfL.
Then there's Baroness Rock, who has called for greater public spending on AI, and Kriti Sharma, perplexingly described as "one of the last chatbot executives in the financial industry". Sharma's Sagebot meant her company no longer needed to describe itself as a stuffy accounting firm, but as a bleeding-edge AI vanguardista. Input can also be expected from Professor Robert Winston and the Rt Rev Steven Croft, the Bishop of Oxford.
Perhaps the preponderance of industry-friendly data-crunchers isn't surprising. The review that formed the AI Sector Deal was co-led by Facebook VP Jerome Pesenti. Facebook is a consumer data processing and advertising company, notorious for its promiscuity with personal data. Indeed, just days after founder Mark Zuckerberg appeared before the Senate to answer how a shady political consultancy obtained the personal data of 87 million Americans, Pesenti was defining what the government should do with your data. His recommendations, with barely an omission, became the "AI Sector Deal".
You and your antiquated data laws – it's all ethics now
The trusts curiously talk about responsibilities on both sides, but this isn't how the law today regards personal data. One side does indeed have duties and responsibilities - and that's the side that collects, processes and stores personal data. In a free society, the individual, who owns the personal data craved by AI companies, does not have duties and responsibilities. There is no duty to "share", except in one instance that we can think of.
In Dave Eggers' satirical sci-fi dystopia The Circle, individuals are shamed into contributing personal information: "Sharing is caring." "Privacy is theft." "Secrets are lies." Rather than asserting individual ownership over data, the council takes a step closer to The Circle's view of data. Life is beginning to imitate satire.
This is not fanciful. Floridi believes we are in a kind of "post-data-ownership" society where "ethics" supersede data protection. If large data processors like Google and Facebook help write the "ethics", they are effectively writing the law that governs themselves. Hence the enthusiasm for "ethics committees" everywhere right now.
You may recall that the Ministry of Fun is also responsible for appointing the membership of the other new body recommended by the AI review. As we reported in June, it appointed DeepMind's Demis Hassabis to advise it.
Last week Google scrapped its DeepMind ethics board and took control of the AI outfit's health operation, falsely asserting to the press that "Trusts own their data". As the sign above the health clinic reminds Mae, the protagonist in The Circle: "To heal we must know. To know we must share." ®
* special advisor