Surveying anonymity and the public good
Our survey shows researchers and the public disagree
Comment Members of the public are wary of having their data used – even anonymously – for research purposes, whilst researchers are altogether more laid back about the proposition.
That is one key conclusion of a Department of Health consultation on Additional Uses of Patient Data (pdf), published on 1 December, which found that "about half of the general public (53%) and patients (46%) thought that identifiable data should never be used without consent while only about one in ten researchers (11%) thought this".
It further reported: "More than half of the researchers (54%) thought that patient identifiable data should sometimes be used without patient consent as long as there was review by a group such as PIAG. Lower proportions of patients (30%) and the general public (30%) agreed".
This difference may reflect little more than a long-running debate between researchers and researched. However, it may also illustrate a number of issues that are likely to figure in debate around public policy and data over the next twelve months, including the over-enthusiasm of the well-intentioned, and a growing rift between Labour and Tories on how central government should use data.
Over the last few months, we have investigated instances where individual data has been collected for research purposes without clear explanation of the purposes to which it would be put, or even any positive effort to obtain permission from the data subjects. This happened in Lincolnshire, when the Local Community Health Services (LCHS) requested intimate details of children’s behaviour and wellbeing from parents.
Public outcry followed, but when we last spoke with LCHS, little had changed. They justified the survey on the grounds that it supported programmes designed to do good, and showed little empathy for individuals who might simply not wish to have their data collected.
They also run a relatively intimate "Lifestyle Behaviour Review" (pdf) of year 8 pupils. Whilst LCHS claim this is "anonymous", they have dragged their heels in response to a question about whether the survey is genuinely "anonymous" in terms of the Data Protection Act. As Data Controllers are aware, the scope of the DPA is in practice synonymous with "identifiable" data: if data items can be combined in such a way that a specific individual is identifiable, a name or address need not be present for the law to apply.
The Office for National Statistics is acutely aware of this issue when publishing small-area statistics. To guard against identification, it uses "record swapping", where "a small sample of records are swapped with a similar record in another geographical area". It also requires that "the average cell size must be greater than or equal to one" and adjusts small counts appearing in any table cell.
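The two controls quoted above can be illustrated with a toy sketch. This is Python purely for illustration, under stated assumptions: the real ONS methodology is considerably more sophisticated, and every field name, threshold and record below is hypothetical.

```python
import random

def swap_records(records, swap_fraction=0.05, seed=0):
    """Illustrative record swapping: move the geography of a small random
    sample of records onto a 'similar' record (here, one matching on age
    band) in a different area, so area-level tables no longer correspond
    exactly to real individuals."""
    rng = random.Random(seed)
    records = [dict(r) for r in records]  # work on a copy
    n_swaps = max(1, int(len(records) * swap_fraction))
    for _ in range(n_swaps):
        a = rng.choice(records)
        # Candidate partners: same age band, different geographical area.
        partners = [r for r in records
                    if r["age_band"] == a["age_band"] and r["area"] != a["area"]]
        if partners:
            b = rng.choice(partners)
            a["area"], b["area"] = b["area"], a["area"]
    return records

def suppress_small_counts(table, threshold=3):
    """Illustrative small-cell adjustment: blank out any table cell whose
    count falls below the threshold, so rare combinations of attributes
    are never published."""
    return {cell: (count if count >= threshold else None)
            for cell, count in table.items()}
```

Record swapping preserves the overall totals (every record still exists somewhere) while injecting enough uncertainty that a small cell cannot be confidently pinned to a named individual; cell suppression then catches whatever rare combinations remain.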
So is the Lifestyle Review – a document that quizzes young people about their drink, drugs and sex habits – genuinely unidentifiable? Given that it carries postcode and must identify age to within a year or so, it is hard to share the LCHS conclusion that it is.
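The risk can be made concrete with a k-anonymity check, a standard measure of re-identification risk (not something LCHS or the ONS are said to use): count how many records share each combination of "quasi-identifier" fields, such as postcode and age. The data and field names below are hypothetical, invented for illustration.

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the size of the smallest group of records sharing the same
    combination of quasi-identifier values. A result of 1 means at least
    one record is uniquely identifiable from those fields alone."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

# Hypothetical survey rows: no names anywhere, but postcode plus age
# can still single a respondent out.
rows = [
    {"postcode": "LN1 2AB", "age": 12, "answers": "..."},
    {"postcode": "LN1 2AB", "age": 13, "answers": "..."},
    {"postcode": "LN9 9ZZ", "age": 12, "answers": "..."},
]
```

Here `k_anonymity(rows, ["postcode", "age"])` returns 1: every respondent is unique on those two fields alone, which is precisely how the DPA's "identifiable" test can bite on a nominally anonymous form.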
Human Rights are non-TRANSFERABLE
Privacy is a human right and human rights are not transferable.
So YOU cannot decide that *I* waive my right to 'no punishment without judicial process' and YOU cannot waive MY right on MY behalf not to be tortured, and YOU cannot waive MY right to privacy and so on.
They may think it's best based on some vague concept of 'general medical good', but then again, everyone always does until their own privacy is violated.
Once they realize the privacy violation DIRECTLY AFFECTS THEMSELVES then suddenly they start screaming stalkers, or like MPs, they're blacking out their expenses claims, their number plates, their homes from Google Street View, taking their children's names off ContactPoint, and so on.
For example, CCTV: ask people if they support CCTV to prevent crime, and they say yes. Then their neighbours put up a CCTV camera pointing at their gardens to record noisy behaviour and they're outraged! The penny doesn't drop till they see how invasive it is to THEM personally.
I also note it's the lack of empathy. If I am a researcher checking 'Human Fin Rot' and I think I'm never going to get 'Human Fin Rot' then I don't empathize and hence don't feel the need to protect THEIR/My privacy, because I don't see it as mine.
On the other hand, if I had AIDS and was researching AIDS I would not in a million years think it was OK to start revealing who has AIDS for some greater good.
And I notice that the more broken a society is, the less empathy with their fellow man, and the more likely they are to approve invasive intrusive privacy violations.... just as long as in their minds, it's EVERYONE ELSE that is the subject.
So to me, the less a society understands the fundamental rights, the more broken they are, the more fragmented, the more incohesive a society is.
"a mindset on the part of some officials...
"... that if the end is for the public good, then it doesn’t matter if the rules around data collection get slightly bent."
Hmm, "it's for your own good", so don't you worry your pretty little heads about it, you can trust us, we're looking after you, just enjoy the Bread and Circuses, erm, I mean go and watch X-Factor and Strictly Come Dancing...
Oh and as for "That would appear to be joined, at the lower levels of government, by a poor understanding of the letter of the law when it comes to Data Protection", and at the higher levels of government by a complete lack of understanding (or they just don't care) that the public don't trust them with our data...!
So much of this is really about trust. A couple of decades ago, if you'd been given a survey and asked to fill it in by the Doctor, you'd probably have done it. The Doctor, in most people's experience, was someone they could trust, and if no name was asked for, what was the problem? Even private enterprise was more trusted; a company asking your opinion of their service probably wouldn't even ask for name, postcode etc.
Wind the clock forward and it's all changed; virtually no-one is trusted, for the perfectly good reason that 8 times out of 10 that trust is in some way abused. Whether it's some health authority knocking up behaviour profiles on the quiet, or a business selling on details of your behaviour to marketers, the greedy and arrogant just can't keep their fingers out of the pie.
Thanks in part to Phorm and the NHS (sounds odd in one sentence!), I now object to pretty much every use of my data or behaviour that isn't explicitly laid out for me to consider and give my assent to. Most I'll never know about, and that really does piss me off.
Till we get an enforceable, tough policy for data use that returns control to us, I'll object to every use, good or bad, and actively try to avoid them. If we can't be trusted to make our own informed decisions, why should we trust those of others?