Brit Home Office bods have denied that retaining millions of custody images of people who were never charged falls foul of case law, while asserting that automatic deletion is not technically possible.
In an intense evidence session with MPs this morning, Home Office minister Susan Williams and the department's director of data and identity, Christophe Prince, were grilled over the government's approach to biometrics.
The witnesses were brought in front of the Science and Technology Committee to answer questions about the long-delayed biometrics strategy, which was due in 2012 but is still unpublished.
Williams got off to a good start by announcing that it was now slated for publication in June, saying the delay was caused by the need to narrow the scope of the strategy and ensure it was in line with the rapidly changing technology.
However, discussion got increasingly heated as the committee moved on to the government's heaving custody image database, which has amassed a whopping 21 million images of faces or identifying features like tattoos or scars.
But these are kept on file regardless of whether or not that person is charged – and in 2012 the High Court ruled that keeping images of presumed innocent people on file was unlawful.
Despite this, the Home Office's solution, set out in last year's Custody Image Review, was to remove images only if the person pictured complained – and even then the request can be denied in "exceptional circumstances".
That's in contrast to the policy on DNA and fingerprint evidence, which is wiped automatically if someone isn't charged.
The committee chair, Lib Dem MP Norman Lamb, argued that the government had effectively admitted that such images should only be kept under exceptional circumstances – and so the retention of images of presumed innocent people was "surely unlawful".
Williams and Prince maintained that the implementation of the Custody Image Review was sufficient to meet the 2012 ruling – and said it wasn't technically possible to set up an automatic system.
"There's no mechanism at the moment to automatically connect the non-conviction to the custody image to prompt the police to make that removal," said Prince. He also pointed to the review process carried out after six or ten years as evidence that no images would be retained indefinitely.
Williams' responses were somewhat less assured, especially under pressure from Lamb.
"I think that the 2012 ruling said that what we were doing was lawful, but... because there is no sort of automatic – within the technology, there's no automatic deletion mechanism... and it's something that obviously, in due course..." she said, trailing off.
When pressed on why it was possible to automatically delete DNA and fingerprints but not images, Williams said it was "because they're two different systems", adding that they had "grown up in different ways".
Prince, meanwhile, tried to argue that the use of DNA and fingerprints in the criminal justice system was "significantly more advanced".
Elsewhere in the hearing, the committee quizzed the witnesses on the police's use of automated facial recognition technology – but failed to get an answer on the occurrence of false positives, or on the specifics of what happened at recent trials in South Wales and London.
Williams did agree that there was a need for wider engagement with the public in order to build up public trust, saying that the biometrics strategy would help with that.
But, she argued, "there is quite an interesting juxtaposition here, where members of the public will give the most incredible amount of data to a faceless internet company but are suspicious of what government might be doing".
To which Lamb retorted: "I suppose it's because the government runs the criminal justice system." ®