FBI techs shy away from facial recognition
Spends 46 years losing face
A senior FBI technologist declared last month that, after decades of evaluation, the agency sees no point in facial recognition.
Speaking at the Biometrics 2009 conference in London, James A Loudermilk II, a senior-level technologist at the FBI, outlined the agency's future biometrics strategy.
He said that 18,000 law enforcement agencies contribute fingerprints and DNA samples to the FBI’s databases and, at their peak, they submit 200,000+ identity verification queries a day. It’s a big operation, and it’s only going to grow, he said.
Under the Next Generation Identification initiative, an irisprint database is likely to be added to the existing fingerprint and DNA databases.
Fingerprints are likely to be augmented with prints of other friction ridges, probably palmprints and maybe footprints. Voiceprints are also being evaluated. Anything that can feasibly increase public safety.
Loudermilk said his aim was to cut the laboratory turnaround time from DNA sample to profile from the current eight-to-ten hours to one. He said the technology was already there; it was a question of feeding it down the levels of law enforcement to every precinct booking station. Once the agency gets turnaround time down to an hour, then perhaps the idea of sampling an entire planeload of passengers starts to look feasible.
What will be missing from this mix, however, is facial recognition.
Facial recognition would be the killer application of biometrics, Loudermilk told the hundreds of conference delegates, and the FBI would dearly love to be able to use facial recognition in its fight against crime.
But it can’t. The algorithms just don’t exist to deliver the highly reliable verification required.
That is despite the FBI having evaluated facial recognition technology since 1963, he said. It didn't invest then. It's not investing now.
Despite the FBI's rubbishing of the technology, delegates from other policing agencies and vendors queued up to declare their intention to introduce facial recognition or claim the technology worked.
These included Alex Lahood of the UK Border Agency. He reiterated former Home Sec John Reid's pledge to check the identity of everyone entering and leaving the UK by 2013. When asked how, he said, probably face recognition and fingerprints.
Clearly, the FBI's word is not good enough for HM government. ®
MinionZero @ Thursday 5th November 2009 13:30
Fair bit of noise in that post of yours, MinionZero, and two strong signals.
Signal #1, the horrors of a surveillance state. Utopians believe that there is a perfect state of affairs. They look at mankind and see a terrible gap between the way life is – imperfect – and the way it should be. That makes mankind hateful to the utopian. Mankind needs to be perfected, according to the utopian. And luckily he, the utopian, is just the man to do the perfecting; he is the exception, he is perfect, and so he sets about destroying the institutions that have evolved to support mankind and replacing them with perfect ones. We know the result. Whether they call themselves communist or fascist, these utopians inevitably make life hell for people. Inevitably, because anyone who believes they know how to perfect mankind must believe that they are some sort of a god, and that is a delusion we would normally diagnose as insanity. Inevitably also, because what they start with is a hatred for mankind.
Utopianism = insane hatred. Enough.
Signal #2, we shouldn't expect 100% reliability from biometrics, they can be useful even if reliability is lower than that and, anyway, they improve over time.
Couldn't agree more. That is why, in my unread disquisition on biometrics, http://dematerialisedid.com/Biometrics.html#homework, I ask the reader to decide in advance what he or she finds to be an acceptable level of error. Take the UK population to be 60 million. A 1% error rate in the biometrics used by the state would give 600,000 of our fellow countrymen a problem. Maybe that's acceptable, considering that 59,400,000 would benefit from those biometrics. If the error rate hits 10% and 6,000,000 people face problems as a result, then perhaps it's easier to decide that that's unacceptable.
The point is that the error rate (false non-match rate, FNMR) for flat print fingerprints seems to be around 20%*. No-one, I suggest, would decide in advance that 12,000,000 people should face problems. It's just off the scale.
And when it comes to face recognition technology, the error rate is in the range 30-50%+*. This isn't a technology that's more or less there, it just needs a bit of tweaking. It's an outright failure. It's certainly nowhere near ready to be released on the public.
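The arithmetic behind those figures can be sketched in a few lines (a minimal illustration: the 60 million population figure and the error rates are taken from the comment above; the function name is mine):

```python
def people_affected(population: int, error_rate: float) -> int:
    """How many people would face a false non-match at the given
    error rate (FNMR), rounded to whole people."""
    return round(population * error_rate)

UK_POPULATION = 60_000_000  # round figure used in the comment

for label, rate in [
    ("1% (hypothetical)", 0.01),
    ("10% (hypothetical)", 0.10),
    ("20% (flat print fingerprints)", 0.20),
    ("50% (face recognition, upper end)", 0.50),
]:
    print(f"{label}: {people_affected(UK_POPULATION, rate):,} people affected")
```

At a 20% FNMR that is 12,000,000 people; at 50% it is 30,000,000, half the population, which is why the comment calls the technology an outright failure rather than one in need of tweaking.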
Compare the drugs industry. We allow drugs onto the market even though they have some side-effects. The decision to release them is based on the expert scrutiny of acres of evidence.
It should be the same in the biometrics industry, you may say, but it isn't. In the UK, the Home Office argue that the 2004 UKPS biometrics enrolment trial wasn't really a trial. That allows them to say that the FNMR for flat print fingerprinting isn't 20% and for face recognition it isn't 50%. But then, what *is* the FNMR for these biometrics at a 0% false match rate? They won't say (please see http://forum.no2id.net/viewtopic.php?p=107567&highlight=#107567). The Australians are the same: they won't release any statistics on the reliability of smart gates.
There is something seedy about a technology industry that won't publicise its results. That's no way to do business. It's irresponsible.
And that is why it is a breath of fresh air when any of the luminaries of the industry *do* speak publicly. Nigel Sedgwick, for example. And Tony Mansfield.
Tony Mansfield is, I think, something of a kingmaker in the world of UK biometrics. If you get his backing for your biometrics technology, you've got a good chance in the market. He told me (or emailed me, to be more precise) that when he and Marek Rejman-Greene were doing their feasibility study (*) for the Home Office, they just couldn't believe how bad face recognition technology is. They got worried about their results. So they looked at other people's evaluations and found even worse results. Which gave them added confidence in their recommendations against face recognition technology.
The problem seems to be that faces keep changing shape, remarkably quickly. Two months after your photograph is taken, face recognition technology is utterly useless, according to their feasibility study. In fact, it may be worse than that. At the UKPS biometrics enrolment trial, verification was performed 5 *minutes* after the photograph was taken. And still we got FNMRs in the range 30-50%! That may have been an unintended consequence of the trial and the trial may not have been run very well but unintended consequences are still consequences and a badly run trial may be precisely how the technology is used at UK airports and elsewhere.
This matter came up at the Biometrics 2009 conference, at lunch with James A Loudermilk II of the FBI and John Mears, Director of Biometric Solutions at Lockheed Martin. There was much chortling at how photographs become more and more useless to face recognition technology the older they are. The point was made in support of the FBI's position: for 46 years it has declined to endorse face recognition technology, and there is still no reason to change that view, even though other departments of state occasionally lean on it to relent.
Mr Loudermilk also explained one reason why flat print fingerprinting is so unreliable compared with traditional rolled prints, taken by police experts using ink. It's simple. Flat print fingerprints are flat. They miss 40% of the fingerprint, the bits on the side that you can only get at by rolling.
But that's just one man emailing another man or three men talking over their sandwiches. It's not in the public domain.
The marvellous breath of fresh air at Biometrics 2009 was to have Mr Loudermilk standing up there on the stage in front of hundreds of delegates saying explicitly that the algorithms simply do not exist to allow face recognition technology to deliver the highly reliable verification required. For several hundred pounds, you can buy the DVD and watch him say it. It's public domain, at last, and now the Home Office must answer the questions about the reliability of the biometrics they are buying with our money, http://dematerialisedid.com/PressRelease19.html
They can't pretend that face recognition technology works for verification even if it doesn't work for identification, as Tony Mansfield sometimes does (*). Or that it works for small populations even if it doesn't work for big ones. It's too late. The bag no longer contains the cat. The FBI, thank God, let it out.
MinionZero, face recognition technology doesn't deserve you. Your stochastic sampling is wasted on it. It just doesn't work.
* Please see http://www.theregister.co.uk/2009/08/14/biometric_id_delusion/ for supporting references
My apologies, some days I just can't get anything right ...
@ D Moss Esq
Oi, who says it's Mr?
It's just plain A J to you, thank you very much. My life is lived in the common gender .....