Whitehall told data-sharing plans need vendor buy-in

Think tank calls for open standards, interoperability

By Rebecca Hill


Government departments should mandate interoperability when procuring systems and establish audit trails to track data use in order to benefit from data sharing, a think tank has said.

Whitehall has long struggled to make the most of the masses of data it holds, largely due to poor quality data and legacy systems that don’t talk to each other, as well as public unease.

In a report (PDF) published this week, public sector think tank Reform set out 13 ways the government could change its fortunes – taking the position that data-sharing will benefit the public sector.


These focus on improving interoperability, giving public bodies the chance to test out data-sharing, training civil servants and creating data standards.

The government has long championed data-sharing as a way of cutting costs, in its ongoing search for “efficiencies”, as well as improving public services – but efforts have been hampered by a litany of ethical, legal and technical issues.

These include a lack of standard data formats and data infrastructure across government, while problems with legacy systems and poor interoperability are exacerbated by the number of vendors involved in public service delivery.

Added to this is the continued problem of siloed working in the public sector – especially when so many datasets are held by local government – and the need for coherent leadership within Whitehall.

To address interoperability problems, Reform noted that APIs can be laid on top of legacy systems. Vendors should be required to ensure their products employ open standards to allow providers to be swapped more easily.

It said that in the longer term, this would encourage competition in the public sector and tackle vendor lock-in.

Public still not loving the data-sharing...

Meanwhile, the government faces widespread scepticism and public distrust over its handling of personal data, thanks to botched projects, secrecy around more controversial data-sharing schemes, and security blunders.

According to Reform, just 9 per cent of people it surveyed felt the government has their best interests at heart when data sharing, and only 15 per cent are confident of authorities' ability to deal with cyber drive-bys.

It said that public engagement, clearer explanations of when and how data is shared and used, and proper auditing of data use would help build confidence. Indeed, the report recommends that all departments establish audit trails tracking how data is used.

This will be particularly important as people become more aware of their data rights as a result of the media frenzy around the General Data Protection Regulation, data breaches and the Facebook data harvesting scandal.

The government has gone some way towards addressing such issues – for instance by creating the Centre for Data Ethics and Innovation and proposals to create data trusts to provide a framework for data-sharing between public and private sectors.

But at the same time, responsibility for data sharing, ethics, open data and governance was recently shunted from the central Government Digital Service to the Department for Digital, Culture, Media and Sport – a move Reform warned might be limiting.

Rather, it called for leadership on personal data sharing to come from the Cabinet Office, to ensure the data-sharing strategy “has influence that reaches across departments”.

In addition, Reform pointed out that the role of chief data officer has been vacant since 2015, and said the post should include responsibility for data-sharing policy to help break down silos.

Similarly, Reform urged Whitehall to remember the role local government must play, calling for it to be involved in establishing data standards and infrastructure.

“By giving local areas space to try and test data-sharing arrangements, it will help to demonstrate which projects are successful and could be scaled up regionally and nationally,” it said.

Allowing bodies the chance to test their work is a central part of Reform’s recommendations – it called on public bodies to share synthetic datasets with one another so each can check that its data meets the standards the others require.

DCMS should also create a seal of approval to indicate data quality is satisfactory and biases have been accounted for, while the Framework for Data Processing in Government should include a Data Quality Assurance Toolkit and ensure public bodies are required to submit data for testing.

Civil servants' skills are also widely varied – Reform said several interviewees referred to "a particularly worrying skills gap within areas of local government" – and that more effort should go into training.

Reform concluded that effecting these changes across government "is not a quick and easy feat", saying that it would take time to tackle the technical barriers, provide the right training to staff and regain public trust. ®
