A tweet that went viral from an airline passenger questioning JetBlue about its use of automated facial recognition at departure gates has called new attention to the growing use of this technology to identify and track travelers.
Our friends at the Electronic Frontier Foundation have an excellent analysis in their Deeplinks blog of some of the unanswered questions raised by this practice. We’ve talked about these before, in our blog and in meetings with DHS officials:
- What is the relationship between the government and its airline and airport “partners” for the use of mug shots of travelers and related identifying information?
- Can travelers really opt out of airport mug shots, and if so how, especially if — as with ceiling-mounted cameras or other new airport designs for “touchless” passenger processing — facial images are automatically captured before travelers reach the point where they could ask to opt out?
- What, if any, restrictions apply to use or “sharing” of the images and tracking data by airlines, airport operators (which are often local government agencies or other parastatal entities), or DHS components or other government agencies?
We agree completely with EFF that travelers should “Skip the surveillance by opting out of face recognition at airports” and that both members of the public and members of Congress should question what is happening, why, and whether it is legally justified.
But we also want to call attention to two additional aspects of this problem that have been overlooked or misinterpreted in much of the recent discussion: retention of facial images and accuracy of automated facial recognition.
While enforceable restrictions on retention of facial images are essential (and currently lacking), they would not be sufficient to mitigate the dangers of these practices.
Neither the government nor its private partners are primarily interested in collecting more photos of travelers. The goal of automated facial recognition is to enable identification and logging of the movements of individuals, which can then be used to track their activities and associations and to make decisions about what they are allowed to do and where, when, and how they are allowed to travel.
The key data for the government or other entities to retain or obtain is not the photo but the record that a specific individual was identified in a specific location, passing through a specific physical or virtual (and perhaps invisible) checkpoint, at a specific time.
By way of analogy, the key data retained and used by automated license plate readers are logs of the locations and times at which specific license plate numbers were identified. Photos of the license plates (sometimes including photos of entire vehicles and occupants) are often retained as well. But they aren’t essential to the use of timestamped vehicle position logs for vehicle surveillance and control. Similarly, none of the key uses and abuses of automated facial recognition in airports depend on retention of the facial images, only on the “travel history” records that they enable.
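The point can be made concrete with a small sketch. The record layout and field names below are our own invention for illustration, not any actual agency's schema: even if every photo is discarded immediately, a bare log of timestamped identifications is enough to reconstruct a person's travel history.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class CheckpointEvent:
    """One log entry: a specific identifier seen at a specific place and time.
    No image is stored -- only the fact of the identification."""
    subject_id: str   # e.g. a license plate number or a face-match identifier
    checkpoint: str   # gate, camera, or reader location
    seen_at: datetime

def travel_history(log, subject_id):
    """Reconstruct a subject's movements from the event log alone,
    sorted chronologically. The photos are irrelevant to this use."""
    return sorted(
        (e for e in log if e.subject_id == subject_id),
        key=lambda e: e.seen_at,
    )

# A toy log: three sightings, no images retained.
log = [
    CheckpointEvent("ABC-1234", "Gate B7", datetime(2019, 4, 18, 8, 5)),
    CheckpointEvent("XYZ-9999", "Gate C2", datetime(2019, 4, 18, 8, 10)),
    CheckpointEvent("ABC-1234", "Gate D4", datetime(2019, 4, 18, 14, 30)),
]

history = travel_history(log, "ABC-1234")
print([(e.checkpoint, e.seen_at.isoformat()) for e in history])
```

Restricting retention of the images, in other words, does nothing to restrict retention or use of records like these.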
Nor does the problem lie in the inaccuracy of automated facial recognition. Use of warrantless, suspicionless, dragnet surveillance data — accurate or inaccurate — as the basis for pre-crime predictions or extrajudicial restrictions on rights is a problem. But more accurate and reliable surveillance technology is not a solution to this problem. The solution lies in an end to unlawful travel controls and attempts to restrict anonymous travel. If whether you are allowed to travel doesn’t depend on who you are (as it shouldn’t), then whether you are “correctly” identified should cease to be a problem.
We have no interest whatsoever in helping build a better surveillance mousetrap. Until these systems are shut down, we want to help travelers find ways to evade or confuse them and to travel anonymously or pseudonymously, as is their right. Members of the public have no duty to make life easier for the neo-Stasi. The less accurate records Big Brother has about our movements — if there are any records at all — the better.
We shouldn’t have to wear masks wherever we go to avoid having our movements logged by a pervasive infrastructure of surveillance cameras and automated facial recognition. Nor should wearing a mask or traveling anonymously be construed as suspicious.
But every time someone succeeds in evading, confusing, or generating incomplete, inaccurate, or spurious entries in travel surveillance logs — whether by wearing a wig or a mask or a hat or eyeglasses, altering their appearance, obscuring themselves in a crowd, being misidentified, or other means of traveling anonymously or pseudonymously — that’s a partial victory for direct action against the homeland-security surveillance state.
Feel free to share your ideas in the comments as to how we can defeat these surveillance systems, get them removed, and — in the meantime — render them less accurate and less useful to those who want to use them to surveil and control our movements.