FYI: Get ready for face scans on leaving the US because 1.2% of visitors overstayed their visas

Uncle Sam wants to run fliers' fizogs through photo databases by 2023

Water cooler Hey El Reg, help me out with this. I saw someone on Twitter complaining that she had to have her face scanned by US airline JetBlue before flying overseas from America. Why would they be doing this?

According to food writer MacKenzie Fegan, she was asked to undergo a facial scan at a computer terminal at the departure gate just before boarding her JetBlue flight, rather than have a flight attendant scan and check her boarding pass. Given the current debate over AI-based facial recognition and privacy, this made her nervous and she went on social media to discuss it.

JetBlue's Twitter reps replied that the facial scan wasn't mandatory, and she should have been able to opt out. They then explained that her picture was matched against a photo database maintained by US Customs and Border Protection to verify her identity and confirm she was listed as a passenger on the flight.

Er, that sounds somewhat dodgy from a privacy perspective?

Many would agree, not least the Electronic Frontier Foundation.

But as JetBlue has pointed out, the facial scanning is not new. It even put out a press release about the practice. The airline started trialing such systems in 2017 and officially began operating the technology at New York's JFK airport last November.

So what's this about – cost savings, crime, Big Brother?

Possibly a little of all three.

First off, JetBlue says it will provide a more seamless travel experience – meaning there's none of that pesky eye-contact and handing over of boarding passes fliers have to endure. The facial recognition system is designed to speed up the boarding process and, in turn, free up JetBlue staff for other duties rather than scanning passes at the gate.

But there's a bigger picture here, and it turns out JetBlue is merely getting into facial recognition early. Every other airline operating in the US will also introduce facial recognition technology in an effort to combat illegal immigration, under a program called Biometric Exit.

In a memo [PDF] issued last week, US Homeland Security gushed about the effectiveness of Biometric Exit, saying that the pilot program in 15 airports had already identified and made note of 7,000 visitors who had overstayed their US visas as they flew out of the States – ensuring they'll most probably never be allowed into America again – and promising to roll the program out across all airports by 2023.

In other words, travelers will just have to get used to it: if you overstay your visa, or right to be in the country, you'll be identified on the way out, a black mark put on your record, and unlikely to be allowed in again.

Is this like a build the wall thing, only for airports?

Not really: this is more about identifying people who entered the US legally in the first place, rather than stopping folks sneaking across the Rio Grande.

The majority of immigrants unlawfully in the US are people who came here on a legit visa, be it tourist, work or student, and didn't leave when they were supposed to. According to Homeland Security, students are the biggest culprits for this (looking at you, Apu), and it estimates that more than half of student visa holders from Chad and Eritrea overstayed their visa time limits in 2018.

Taking into account the different types of visas, Homeland Security estimates that, overall, around 1.22 per cent of overseas visitors (that's 666,582 people) outstayed their welcome in the Land of the Free™ last year. Biometric Exit is designed to clock those folks just as they are about to board a plane out of America.

The Biometric Exit scheme, from what we can tell, compares photographs of visitors as they exit with photos taken when they arrived: if there's a match, visa records are checked, and those who overstayed have their files updated. It's designed to replace the current system, in which Customs and Border Protection agents manually match passenger manifests against lists of overstayers.
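That match-then-check workflow can be sketched roughly like this. To be clear, this is an illustrative toy in Python, not the government's actual pipeline: the embeddings, similarity threshold, and record fields are all our assumptions, and a real system would generate embeddings with a trained face recognition model rather than toy vectors.

```python
import numpy as np

def cosine_similarity(a, b):
    # Standard cosine similarity between two face-embedding vectors
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_exit_photo(exit_embedding, arrival_records, threshold=0.9):
    """Return the arrival record whose stored face embedding best
    matches the exit photo, if the similarity clears the threshold."""
    best_record, best_score = None, threshold
    for record in arrival_records:
        score = cosine_similarity(exit_embedding, record["embedding"])
        if score > best_score:
            best_record, best_score = record, score
    return best_record

# Hypothetical arrival records: in a real system the embeddings would
# come from photos taken at entry; here they are toy unit vectors.
arrivals = [
    {"visa_expiry": "2019-01-01", "embedding": np.array([1.0, 0.0, 0.0])},
    {"visa_expiry": "2020-06-30", "embedding": np.array([0.0, 1.0, 0.0])},
]

exit_photo = np.array([0.98, 0.05, 0.01])   # closest to the first record
match = match_exit_photo(exit_photo, arrivals)

departure_date = "2019-05-01"
if match and departure_date > match["visa_expiry"]:
    print("overstay flagged")   # visa expired before the departure date
```

The key design point is that the airline only supplies the gate photo; the matching and the visa-record lookup happen against the government's database.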

So, if you've done nothing wrong, you have nothing to fear?

Well, that's always been a rather facile argument. Who knows what this administration, or the next one, or one in the future, will use this technology for.

The fact of the matter is facial recognition systems are still in their infancy, and suffer from some serious problems. The datasets used to train such neural networks are sometimes of poor quality, and the algorithms may not be all that accurate, particularly when it comes to people of color. False positives are part and parcel of advanced computer systems like this, and that's going to be little comfort to someone who is misidentified at boarding time.
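The false-positive problem is easy to quantify at airport scale. Taking the roughly 54 million annual overseas departures implied by Homeland Security's own figures (666,582 being 1.22 per cent of the total), and a purely illustrative 0.1 per cent misidentification rate – our assumption, not a published accuracy figure for any deployed system:

```python
# Back-of-the-envelope estimate of misidentifications at scale.
# Both numbers below are assumptions for illustration only.
passengers_per_year = 54_000_000   # rough scale of annual US overseas departures
false_positive_rate = 0.001        # assumed 0.1% misidentification rate

expected_mismatches = passengers_per_year * false_positive_rate
print(f"{expected_mismatches:,.0f} travellers misidentified per year")
```

Even a matcher that is right 99.9 per cent of the time would, on those assumptions, misidentify tens of thousands of travellers a year.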

There is also concern that the information being collected could be used by other federal agencies for almost any purpose, since at the moment there is very little in the way of case law and regulations restricting government use of facial recognition. One potential privacy benefit is that the airlines themselves don't act as guardians of your facial data, nor can they use it directly – as far as we know, at least. It's all held in Uncle Sam's databases, which is supposed to be reassuring?

Seeing as the US government hasn't protected its most critical data in the past, don't get too comfortable. ®
