Original URL: https://www.theregister.co.uk/2006/05/26/bichard_part3/
Bichard reports widespread errors in Police data
Still flakey after all these years
Almost half of all police forces that have been audited by the police database team of HM Inspectorate of Constabulary have been pulled up for duff data management, said the third report of the Bichard inquiry yesterday.
Sir Michael Bichard's inquiry into the intelligence failures leading up to the murders of Soham schoolgirls Holly Wells and Jessica Chapman led to a 2004 report that recommended measures to improve the quality and timeliness of data input into the Police National Computer (PNC). HMIC's audits of Britain's 51 police forces were subsequently conducted through Bichard's lens.
The "progress report" published yesterday was meant to demonstrate how well the Home Office had responded to Bichard by making police data more reliable. Instead, it showed there was still a long way to go before police data could be treated as gospel.
"HMIC has commenced direct communications with 13 forces which are causing varying degrees of concern in relation to their actual performance or their general direction of travel," said the progress report.
It noted evidence provided by HMIC audits about the timeliness of data input into police computers.
Almost a third of British forces were not meeting tough statutory targets for inputting data about arrests and summonses into the computer on time, it said, drawing on the completed audits of data quality and related working practices that HMIC has made of British police forces.
It also noted that 39 per cent of forces were not inputting records of court proceedings within statutory deadlines.
But it skirted the other key data concern for Bichard, that of data quality. Error rates of between 15 and 86 per cent were identified in police data in the years before the Soham murders. Data errors are still a problem, as demonstrated by a recent string of reports about the Criminal Records Bureau, which draws its data from the PNC.
HMIC's audits, called Police National Computer Reports, do take an interest in erroneous police data, but their remit is limited by the Bichard recommendation, which concerned ensuring that data newly input into the PNC is of good quality. Their scope does not extend to existing data.
The last Home Secretary had ordered HMIC to concentrate its efforts on assessing timeliness, and that's what Bichard reported. Yet data quality is a concern for civil liberties organisations and those people who end up on the wrong side of the law because of data errors.
The most recent PNC audit report published by HMIC, that of Avon and Somerset Constabulary, noted that 22 per cent of records that had already been checked by supervisors still contained an error. The error rate concerned a sample of records input in recent months. Old data, which might contain more errors, is not audited.
Nevertheless, HMIC's work, and constant mithering of those police forces that do not have accurate databases, has brought about gradual improvement. Statutory guidance for data quality and related working practices was introduced in April.
David Stevens, programme chair of Impact, the police intelligence computer effort prompted by Bichard, told an audience at the Association of Chief Police Officers conference this week that input was still a problem.
"At the moment there are no penalties for feeding garbage into a computer, but wrong information can have serious consequences," he said.
He refused to elaborate. On reflection, the official view of Impact is that penalties won't help; a reverence for data quality must instead become part of the culture of the police.
Stevens said the guidance and standards for handling data will bring about that cultural change. So will the work of HMIC.
But it is slow work. HMIC has audited 31 of 43 police forces in England and Wales and seven of eight in Scotland. It works on a three-year cycle and is hoping to have completed its first round of PNC audits by October.
Each audit generates an action plan designed, as the progress report said, to "address shortfalls in performance". Data quality will then be fostered by careful monitoring and encouragement of those forces that have the most glitches in their data.®