No hiding place - facial biometrics will ID you, RSN
Falling costs and error rates threaten to eliminate privacy
Opinion One of the technologies that has been on a steadily accelerating price/performance curve is face recognition technology. A series of talks this year demonstrated just how quickly the software is improving.
Every year there’s an event in London called Biometrics. It’s a conference all about the state of biometric technology and is accompanied by an exhibition.
I’ve been going for a few years now, and I find it really interesting to walk the floor to see what is changing, to see which technologies are in and which are on their way out.
The state of the art can be illustrated by some of the examples given by the US National Institute of Standards and Technology (NIST) in their presentation on testing biometrics.
- A “1 in 1.6m” search (that is, looking for one photo in a database of 1.6m photos) on a 16-core 192GB blade (about £25,000 worth of machine) takes less than one second, and the speed of such searches continues to improve. So if you have a database of a million people, and you’re checking a picture against that database, you can do it in less than a second.
- The best false non-match rate (in other words, the proportion of searches that fail to return the correct picture) is falling fast: in 2002 it was 20 per cent, by 2006 it was 3 per cent and by 2010 it had fallen to 0.3 per cent. That is an order-of-magnitude fall every four years, and there’s no reason to suspect that it will not continue.
- The results seem to degrade by the log of population size (so that a 10 times bigger database delivers only twice the miss rate). Rather fascinatingly, no one seems to know why, but I imagine it must be some inherent property of the algorithms used.
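To make the mechanics of such a search concrete: a 1:N identification system typically reduces each face to a fixed-length embedding vector, and the search itself is then just a nearest-neighbour lookup over the gallery, which is why a million-entry database can be scanned in under a second on commodity hardware. The sketch below is illustrative only; the 128-dimensional embeddings, gallery size and random vectors are assumptions, not NIST's actual test setup.

```python
# Minimal sketch of a 1:N face search over an embedding gallery.
# Real systems extract embeddings from face images with a trained
# model; here random unit vectors stand in for them (an assumption).
import numpy as np

rng = np.random.default_rng(42)

GALLERY_SIZE = 100_000   # stand-in for the 1.6m-photo database
DIM = 128                # typical embedding length (assumed)

# Enrolled gallery: one unit-length embedding per photo.
gallery = rng.normal(size=(GALLERY_SIZE, DIM))
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)

def identify(probe, gallery, top_k=10):
    """Return indices of the top_k most similar gallery entries."""
    probe = probe / np.linalg.norm(probe)
    scores = gallery @ probe          # cosine similarity, one matmul
    return np.argsort(scores)[::-1][:top_k]

# A probe that is a noisy copy of gallery entry 1234 should rank
# that entry first, well clear of the random impostors.
probe = gallery[1234] + 0.05 * rng.normal(size=DIM)
hits = identify(probe, gallery)
assert hits[0] == 1234
```

The whole search is a single matrix-vector product, which is why it parallelises so well across the cores of a blade server.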
We're still some way from Hollywood-style biometrics, where the FBI security camera can spot the assassin in the Super Bowl crowd: the false match rate is still too high. In a database of a million people, if you are searching for me you will get back hundreds, if not thousands, of false matches. But it is the trend line that matters here, not the instantaneous performance, and the initial use cases will involve searching much smaller databases anyway. Say I’m at a conference, for example, and I see someone I recognise but can’t remember their name: I point my iPhone camera at them (surreptitiously) and ask the LinkedIn app to search my network, or perhaps I ask the conference organiser to identify them. In either case, the system is searching a database of only a few hundred faces.
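The gap between "thousands of false matches" and a usable shortlist is simple arithmetic: a 1:N search amounts to roughly N independent one-to-one comparisons, so the expected number of false hits per search is the per-comparison false match rate times the gallery size. A back-of-envelope sketch, with assumed (not measured) false match rates:

```python
# Why a big gallery floods you with false matches: a 1:N search is
# roughly N independent one-to-one comparisons, so the expected
# number of false hits is FMR x N. The per-comparison false match
# rates below are illustrative assumptions, not measured figures.

def expected_false_hits(fmr: float, gallery_size: int) -> float:
    """Expected false matches returned by one search of the gallery."""
    return fmr * gallery_size

# A 1-in-1,000 per-comparison false match rate swamps the operator
# with candidates...
print(expected_false_hits(1e-3, 1_000_000))   # 1000.0
# ...but two orders of magnitude better leaves a shortlist you can
# check by eye.
print(expected_false_hits(1e-5, 1_000_000))   # 10.0
```

This is why a modest improvement in the per-comparison error rate transforms the usefulness of million-scale searches.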
That trend line is unlikely to take us all the way down to a simple yes/no: as The Economist recently noted, biometric systems do not give binary answers to such questions. They are “probabilistic”, and this makes them inherently fallible. The chance of producing an error can be made small but never eliminated. Confidence in the results therefore has to be tempered by a proper appreciation of the uncertainties in the system. Nevertheless, in a couple of years’ time you’ll be able to search a database of a million pictures, looking for me, and get back a dozen hits: me and a few false matches that you’ll inspect by eye. That’s good enough, I’d suggest, to radically transform the way we interact. You can see that we are heading into new territory: even entry-level consumer software such as Picasa or iPhoto has this stuff built into it.
iPhoto face recognition in action
It's not perfect, but it’s pretty good. And these kinds of “passive” biometrics are finding their way into all sorts of places. Microsoft’s new games platform creates a 3D model of a player. One of the team doing this points to a prosaic use case: “It knows when they are sneakily trying to log into their older brother's account and trying to cheat the system... You can’t do it. Your face is the ultimate detection for the device.”
There’s going to be some fallout, if you ask me. Here’s why: you have no control over what pictures of you other people post on the Internet. Suppose there’s a picture of me in a mosque somewhere, or coming out of the Socialist Workers’ Party bring ‘n’ buy sale, or heading in to an Alcoholics Anonymous meeting. I believe these things are private matters, so I can resolve not to mention them on my blog and not to post pictures of me in mosques; perhaps I might even be able to persuade my friends not to post any pictures of me at prayer, carrying a Lenin lampshade or on the scales. But someone I don't know, and who doesn't know me, takes a picture that has me in it and posts it on the web somewhere.
Meanwhile, someone has set their spider off crawling the Internet. My face is one of the faces loaded from LinkedIn, or our corporate website, or a conference site, or wherever. The spider finds my face in the lampshade picture and adds it to the catalogue. Now, the “secret” is out, and catalogued, and there's nothing that can be done about it. Nothing.
Now, some people would argue that there are legitimate reasons for wanting this kind of system. For instance, in the Mexican city of Leon – where drug cartels are rife – iris and face scanners are being installed to analyse approximately 50 people per minute as they walk the streets. This means that the system can monitor an entire room and keep a constant watch over who is present, sending identification information to relevant authorities.
We should bear in mind that this technology is available to the drug cartels as well, so if they’re not getting the feed from this system, they’ll soon make their own. In fact, pretty much anyone will be able to have their own system like this, and they won’t even have to install the cameras themselves... In the UK, a new service pays the public to monitor live commercial CCTV footage online! It’s just been launched in Devon. Internet Eyes will pay up to £1,000 to subscribers who report suspicious activity such as shoplifting.
Remember those distributed tasks that we used to download as screensavers? Any day now we'll be able to download a Crimewatch screensaver that scans the CCTV feeds while we're not using our computers and looks for the top 10 most wanted. And debt management companies will be able to look for defaulters and the DWP will be able to look for deadbeat dads, and so on.
Unless we introduce a firm plan for online anonymity pretty soon, we’re not going to have any anonymity at all. What I mean by this is that I cannot see any plausible roadmap that delivers offline privacy other than wandering around all day wearing comedy disguises (see, for example, the recent assassination in Dubai)... and they will only get us so far, before voice analysis, gait analysis and so on take their toll. The falling costs of biometrics and the exponential power of Big Brother (ie, us) not only remove privacy as a possibility, they do so in fairly short order. This may have the unexpected consequence of driving more interpersonal and corporate interaction into virtual worlds. It is only in virtual worlds that technology is available in any reasonable timescale that can deliver individual privacy.
I wrote some years ago that one might imagine a flight to virtual communities, where mathematics (in the form of cryptography) provides an impenetrable defence against crime and disorder that the metal barriers of a gated community cannot. This makes it all the more of a priority that any framework that we develop to manage identities in cyberspace is centred on privacy. So write to your MP immediately and tell him or her that you want “user-centred identity” and “privacy-enhancing technology” at the heart of the government’s IT strategy. (I’m sure they’ll know what you mean.) There is no point complaining about the use of biometrics in the real world, but there is a point in building a virtual alternative. ®