Mar 15 2017

Palantir, Peter Thiel, Big Data, and the DHS

San Francisco and Silicon Valley are among the centers of opposition to President Trump and his fascism, especially as it relates to restrictions on movement, border controls, immigration, and asylum.

Bay Area technology companies and their better-paid classes of employees like to think of themselves as building a better world that reflects the distinctive values that have attracted dreamers and futurists to this region from across the country and around the world. But some of these companies are key developers and providers of “big data” tools for the opposite sort of “Brave New World”.

On Saturday, Edward Hasbrouck of the Identity Project was invited to speak to an ad hoc group of picketers outside the Pacific Heights mansion of Palantir Technologies founder and Trump supporter Peter Thiel (photo gallery from the SF Chronicle, video clip from KGO-TV; more photos from the East Bay Express).

As Anna Wiener reported in the New Yorker (“Why Protesters Gathered Outside Peter Thiel’s Mansion This Weekend”):

David Campos, a former member of the San Francisco board of supervisors, who emigrated from Guatemala, in 1985, stood on the brick stoop and raised a megaphone. “The reason we’re here is to call upon the people who are complicit in what Trump is trying to do,” he said. Clark echoed the sentiment. “If your company is complicit, it is time to fight that,” she said. Trauss, when it was her turn, addressed Thiel, wherever he was. “What happened to being a libertarian?” she asked. “What happened to freedom of movement for labor?”

Edward Hasbrouck, a consultant with the Identity Project, a civil-liberties group, took the stand, wearing a furry pink tiger-striped pussyhat. “The banality of evil today is the person sitting in a cubicle in San Francisco, or in Silicon Valley, building the tools of digital fascism that are being used by those in Washington,” he said. “We’ve been hearing back that there are a fair number of people at Palantir who are working really hard at convincing themselves that they’re not playing a role — they’re not the ones out on the street putting the cuffs on people. They’re not really responsible, even though they’re the ones who are building the technology that makes that possible.”

It’s easy to rationalize the creation of technological tools by saying that they can be used for good as well as evil. But you can’t separate the work of tool-making from the ways those tools are being used. Palantir workers’ claims to “neutrality” resemble the claims made in defense of IBM and Polaroid when they were making and selling “general purpose” computers, cameras, and ID-badge making machines to the South African government in the 1970s. None of this technology and equipment was inherently evil. But in South Africa, it was being used to administer the apartheid system of passbooks and permissions for travel, work, and residence.

The same goes for “big data” today. To understand what’s wrong with the work being done by Palantir for the US Department of Homeland Security, it’s necessary to look not just at what tools Palantir is building but at how and by whom they will be used; not just at the data tools but at the datasets to which they are applied, the algorithms they use, and the outcomes they are used to determine.

Palantir Technologies is under contract with the DHS to build the Analytical Framework for Intelligence (AFI) for US Customs and Border Protection (CBP), and the Investigative Case Management (ICM) system for US Immigration and Customs Enforcement (ICE).

While development of AFI and ICM was commissioned by different DHS components, both are front-end data mining, data visualization, and data analytics modules for the TECS system, a “system of systems” nominally overseen by CBP but used throughout DHS, by at least nine other Federal departments, and by state, local, and foreign “partners”.

Some Palantir workers have made much of the distinctions between DHS agencies and between components within ICE, defending their work on the theory that the launch customers for these Palantir systems aren’t directly involved in deportation or exclusion of would-be immigrants. But even if this were true today (it isn’t, as discussed below), both AFI and ICM are part of a long-term, high-priority TECS Modernization (“TECS Mod”) program specifically intended to eliminate this sort of data “compartmentalization” and to make all government or commercial data that is available to any DHS component or other government agency available to all such components, so that they can better “connect the dots” or establish guilt by association. Dianne Feinstein, the US Senator from Silicon Valley who lives on the next block from Peter Thiel in San Francisco, has been one of the harshest critics of any technical or policy barriers to inter-agency data sharing.

The eventual goal of the TECS Mod project is to replace all existing TECS systems and pool all the data accessed by TECS in a new shared data lake called the “DHS Data Framework” — not to build a separate toolkit for each agency. If AFI and/or ICM are considered successful by their initial users, they will be incorporated into the DHS Data Framework and made available to other users of the current TECS platform.

What data is used by Palantir’s AFI and ICM tools?

Both AFI and ICM use data imported and aggregated from TECS, other DHS and government databases, and external commercial sources.

TECS records relied on by both AFI and ICM include Passenger Name Records (PNRs) or airline reservations from the Automated Targeting System (ATS), border crossing and entry/exit logs (lifetime international “travel history” files), and free-text notes made by border guards and other CBP staff (and kept in travelers’ permanent files even when no violation of law was suspected).
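
To make concrete what such aggregated files can contain, here is a minimal, purely illustrative Python sketch of the shape of a traveler dossier of this kind. All class and field names are invented for illustration; they do not reflect actual TECS, ATS, AFI, or ICM schemas.

    # Illustrative only: a hypothetical, simplified shape for the kind of
    # aggregated traveler dossier described above. These are NOT actual
    # TECS, ATS, AFI, or ICM field names.
    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class PnrRecord:
        record_locator: str        # airline reservation reference
        itinerary: List[str]       # e.g. ["SFO-FRA", "FRA-SFO"]
        meal_requests: List[str]   # standardized special-service meal codes
        contacts: List[str]        # phone numbers and e-mail addresses

    @dataclass
    class TravelerDossier:
        name: str
        passport_number: str
        pnrs: List[PnrRecord] = field(default_factory=list)
        border_crossings: List[str] = field(default_factory=list)  # lifetime entry/exit log
        officer_notes: List[str] = field(default_factory=list)     # free-text remarks, retained indefinitely
        commercial_data: Dict[str, str] = field(default_factory=dict)  # unverified data-broker records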

All of this data is best understood as the product of warrantless, suspicionless (and illegal) dragnet surveillance. Just as the NSA collects metadata about the movements of our messages, DHS collects this metadata about the movements of our physical bodies. Since 9/11, airlines have been required to spend more than US$2 billion, by the DHS’s own (under)estimate, to integrate DHS root access and positive controls into their IT systems and those of their IT outsourcing providers. The government gets and retains mirror copies of your international airline reservations, regardless of whether you are under any suspicion whatsoever.

Like other DHS systems, AFI and ICM both also make use of “garbage in, garbage out” aggregations of unverified personal information from commercial data brokers.

Any of this data can be kept secret, at the “discretion” of the DHS. You can and should request your TECS and other travel records from DHS, but the response is likely to be heavily and arbitrarily redacted. You are judged on the basis of this file whenever you fly or cross the US border, and could be asked about entries in it years later, so it’s useful to know as much as possible about what it says. But in response to our request for these records, CBP first put us off for three years and then exempted the whole system of records from the Privacy Act. We sued, but the court approved the retroactive application of the exemption to the request we had made three years earlier. And that was a request from, and a lawsuit by, a US citizen. Non-US persons have no rights under the Privacy Act. In his first week in office, President Trump issued an Executive Order forbidding any Federal agency from using its discretion to tell foreigners what records it keeps about them.

What algorithms are used to analyze this data?

Because the algorithms for mining and analyzing TECS data and other DHS travel records are secret, almost everything we know about these algorithms comes from reverse engineering. In that lawsuit, we requested information about how CBP mines travel history data. But the court agreed with CBP that this was exempt from the Freedom Of Information Act.

One category of algorithmic rule whose existence and routine use has been explicitly confirmed is the “TECS Lookout” for a specific person. A “TECS Lookout” is a rule that is triggered and generates an e-mail message to a specific law enforcement officer whenever a reservation matching specific criteria such as a name and/or passport number of a “person of interest” is sent to CBP by an international airline, typically 72 hours before the flight. The agent can then arrange a “welcome party” at the port of entry or departure to detain, search, interrogate, or passively surveil the person of interest.

Because this all takes place in the area of the border or its virtual equivalent, a “TECS Lookout” can be used at any time, against anyone, for any reason, without warrant or suspicion. A TECS lookout can be set by any Federal law enforcement officer (it’s a standard tool in the manual for IRS tax collectors, for example), or by a Federal agent at the request of state or local police.
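
As a rough illustration of how simple such a rule is, here is a minimal Python sketch of a per-person lookout of the kind described above: match incoming reservation data against a name and/or passport number, and alert the officer who set the lookout. The Lookout class, the field names, and the notify() helper are all hypothetical, not the actual TECS implementation.

    # Illustrative only: the general shape of a per-person "lookout" rule as
    # described above. Names, fields, and notify() are hypothetical.
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Lookout:
        officer_email: str
        name: Optional[str] = None
        passport_number: Optional[str] = None

    def notify(officer_email: str, reservation: dict) -> None:
        # Stand-in for the e-mail alert described in the text.
        print(f"ALERT to {officer_email}: match on {reservation.get('name')}")

    def lookout_matches(lookout: Lookout, reservation: dict) -> bool:
        """True if the reservation matches every criterion the lookout specifies."""
        if lookout.name and reservation.get("name") != lookout.name:
            return False
        if lookout.passport_number and reservation.get("passport_number") != lookout.passport_number:
            return False
        return True

    def process_reservation(reservation: dict, lookouts: List[Lookout]) -> None:
        # Reservations reach CBP from airlines, typically about 72 hours before
        # an international flight; each matching lookout triggers an alert.
        for lookout in lookouts:
            if lookout_matches(lookout, reservation):
                notify(lookout.officer_email, reservation)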

The best-documented use of a TECS Lookout was against David House, an MIT computer scientist and volunteer with the Chelsea Manning Support Network. The Army wanted to interrogate Mr. House and search his electronic devices, but lacked probable cause to suspect him of a crime or obtain a warrant. So an Army investigator asked ICE to set a TECS Lookout for Mr. House, which was duly triggered a few days before Mr. House was scheduled to fly to Mexico for a vacation.

ICE had no jurisdiction over Mr. House as a US citizen. But that had no effect on ICE’s ability to set a TECS Lookout for him or to search and interrogate him and seize and copy his electronic data at the airport when he arrived in the US. All of this was done by the Homeland Security Investigations division of ICE — the same ICE component that is the launch customer for Palantir’s Investigative Case Management (ICM) system.

Algorithms applied to TECS data also include a variety of individual and categorical/descriptive rules that are euphemistically described as “watchlists”. It’s more accurate to call them “blacklists”, since the consequences of a “match” with a “watchlist” rule aren’t usually limited to passive “watching”. Watchlisting/blacklisting rules are secret and have almost never been reviewed by the courts.

Only one person has ever been told why they were placed on a DHS watchlist or blacklist: Dr. Rahinah Ibrahim was placed on the “no-fly” list because an FBI agent assigned to surveil mosques and interview Muslims misunderstood a “negative check-off” watchlist nomination form, checked the boxes he was supposed to leave blank, and left the box blank that he was supposed to check.

Dr. Ibrahim, then a Ph.D. candidate at Stanford University, was arrested at SFO the next time she tried to check in for a flight. She was eventually allowed to fly back to Malaysia, but then her US visa was revoked (a decision not subject to judicial review under US law). She defended her dissertation and received her Ph.D. in absentia, and is now chair of her department at a university in Malaysia. Our loss. Although one of her children was born in the US and is a US citizen, Dr. Ibrahim has never been able to return to the US to pursue the opportunities she had been offered to commercialize her doctoral research.

Dr. Ibrahim learned how she had been blacklisted only after nine years of litigation, costing her pro bono lawyers US$3.9 million and culminating in a week-long trial in Federal court in San Francisco that we reported on in 2013. Neither Dr. Ibrahim (of course) nor her US-citizen daughter, who was to have been a witness, was able to come to the US to attend the trial. To date, this is the only challenge to a no-fly, watchlist, or blacklist decision to make it to trial. In the course of the lawsuit, Attorney General Holder signed a perjured affidavit claiming that it would jeopardize national security to reveal whether or why Dr. Ibrahim had been blacklisted, as though it were a state secret that an FBI agent had made a mistake.

More complex algorithms are used for predictive scoring and pre-crime profiling, despite the complete absence of any evidence that the DHS or Palantir have any “pre-cogs” — human or robotic — who can discern whether people not accused of any crime, or subject to an injunction or restraining order, pose a future “threat”.

It’s important to note that none of these or any of the other algorithmic rules for profiling, analysis, or decision-making based on travel records or commercial data reflect court orders, judicial fact-finding, or any pretense of due process. None of these rules has ever been reviewed by any court. The US government claims the right to search and surveil anyone crossing an international border without warrant, probable cause, or suspicion, and refuses to recognize that travel is a right guaranteed by law, the Constitution, and international treaties. All of these secret rules for analyzing secret dossiers, profiling travelers, and assigning consequences based on secret algorithms operate in the law-free zone of US borders and international airports, and have almost entirely evaded judicial review.

Does AFI or ICM already contain a rule to identify or locate Muslims or their associates? Probably not. But could such a rule be added? Yes, easily. PNRs included in ATS, stored on the TECS platform and accessible to and/or ingested into AFI and ICM, include standardized codes that identify anyone who orders a Halal meal on an international flight to or from the US. Some European airlines try to filter out this particular “sensitive” data, but US and most other foreign airlines aren’t required to filter it out, and don’t even try. And it only takes one record of a Halal meal on any one flight to flag you for life as a Muslim in your permanent file with the DHS.

If an analyst using AFI or ICM wants to generate a list of everyone who has ever ordered a Halal meal, or a map of where to find them, or a social network graph of the relationships between them and of everyone else with whom they are associated in the dataset, that’s exactly the sort of analytic and data visualization task for which Palantir’s tools are designed. And it’s fully supported by the existing content of the dataset on which these tools are designed to operate.
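
As a rough illustration of how little “analysis” such a query requires, here is a minimal Python sketch, assuming hypothetical record structures, of a filter on a “sensitive” meal code and a co-traveler association graph built from shared reservations. (“MOML” is the standard airline special-meal code commonly used to request a halal meal; the rest of the field names are invented.)

    # Illustrative only: filtering on a "sensitive" meal code and building an
    # association graph from shared reservations. Record structures are
    # hypothetical (plain dicts), not actual AFI or ICM data models.
    from collections import defaultdict
    from itertools import combinations

    def flagged_travelers(dossiers, meal_code="MOML"):
        """Everyone with at least one matching meal request in any reservation, ever."""
        return [d for d in dossiers
                if any(meal_code in pnr.get("meal_requests", []) for pnr in d["pnrs"])]

    def association_graph(dossiers):
        """Edges between travelers who appear together in the same reservation."""
        travelers_by_pnr = defaultdict(set)
        for d in dossiers:
            for pnr in d["pnrs"]:
                travelers_by_pnr[pnr["record_locator"]].add(d["name"])
        edges = set()
        for names in travelers_by_pnr.values():
            edges.update(combinations(sorted(names), 2))
        return edges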

What consequences are determined by this data processing and analysis?

“Vetting” or “extreme vetting” of immigrants, visitors, and US-citizen travelers means making decisions about what they are, and are not, allowed to do. Those decisions are made by multiple DHS components, and other collaborating pre-crime policing agencies, in multiple locations.

Scoring, profiling, or flagging you as a “person of interest”, placing you under suspicion because of associations or pattern matching, or sending a surveillance and tracking alert to an investigator with some law enforcement agency are the least of the extrajudicial consequences that can be assigned through the use of Palantir’s decision-support “vetting” tools:

  • Anyone, regardless of citizenship, can be singled out for detention, delay, and more intrusive search or questioning at any border or international port of entry or exit.
  • A non-US person can have their visa or ESTA visa waiver status revoked (without right of judicial review) and be flagged for removal by ICE (if they are in the US) or refusal of re-entry by CBP (if they are out of the US when their visa is revoked).
  • A non-US person can be denied entry on arrival at a US border or port of entry (even if they have a visa or ESTA), and detained pending deportation.
  • Anyone, including a US citizen, can be denied boarding by an airline, anywhere in the world, on the basis of a “Boarding Pass Printing Result” or “recommendation” from a CBP pre-crime robot or a CBP agent at the National Targeting Center, one of the Regional Carrier Liaison Groups, the Passenger Analysis Unit inside the CBP office at SFO or another US international airport, or one of the CBP “advisors” permanently stationed at foreign airports and foreign passenger targeting centers (a conceptual sketch of this per-passenger decision follows this list). The US asserts a baseless claim of extraterritorial jurisdiction over who is allowed to board foreign-registered aircraft at foreign airports. For an asylum seeker, denial of boarding can amount to a death sentence, by preventing flight from a place of grave danger to a place of refuge. This is not a hypothetical issue: the first court order against President Trump’s #MuslimBan2.0 Executive Order, for example, enjoined CBP from trying to prevent a Syrian refugee family currently in danger in Aleppo from boarding a flight (most likely from Turkey) to the US, where they hope to seek asylum on arrival.
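
As a rough conceptual sketch only (not CBP’s actual message formats or criteria, which are secret), the per-passenger “Boarding Pass Printing Result” described above amounts to something like the following: the airline submits passenger data before issuing a boarding pass and acts on whatever response comes back. All rule sets, names, and return values here are invented for illustration.

    # Illustrative only: a conceptual per-passenger boarding decision of the
    # kind described above. Rules, names, and return values are invented;
    # the actual CBP formats and criteria are not public.
    def boarding_decision(passenger: dict, blacklist_rules, scoring_rules) -> str:
        for rule in blacklist_rules:
            if rule(passenger):
                return "DO_NOT_BOARD"   # airline is told not to issue a boarding pass
        if any(rule(passenger) for rule in scoring_rules):
            return "SELECTEE"           # flagged for more intrusive search before boarding
        return "CLEARED"

    # Example: one hypothetical name-match rule, applied with no judicial review.
    rules = [lambda p: p.get("name") == "DOE/JOHN"]
    print(boarding_decision({"name": "DOE/JOHN"}, rules, []))   # -> DO_NOT_BOARD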

People who can get a job with Palantir are likely to have other employment options. It’s easy to forget, in the closed world of your cubicle, that you are dealing with data about real people. Algorithmic operations affect real lives. It is on the basis of this data, these algorithms, and these consequences for real people that we hope Palantir workers will think about what they are doing, whether it reflects their values, and whether they are building the world they want to live in.
