Ahead of its Sept. 30 public listing, Palantir, a secretive data analytics company with ties to the defense and intelligence communities, has been touting itself as a tool that can “target terrorists and keep soldiers safe.” With an expected valuation of about $22 billion, it will become one of the biggest surveillance companies in the world.
Palantir promises that where “technological infrastructure has failed,” it can create clarity and order. But to deliver, its software needs data — lots of it.
Now, two never-before-seen documents, “Intermediate Course” and “Advanced Course” training manuals, reveal how the Los Angeles Police Department has taught its officers to use Palantir Gotham, one of the most controversial and powerful law enforcement tools in the world.
Much of that LAPD data consists of the names of people arrested for, convicted of, or even suspected of committing crimes, but that’s just where it starts. Palantir also ingests the bycatch of daily law enforcement activity. Maybe a police officer was told a person knew a suspected gang member. Maybe an officer spoke to a person who lived near a crime “hot spot,” or was in the area when a crime happened. Maybe a police officer simply had a hunch. The context is immaterial. Once the LAPD adds a name to Palantir’s database, that person becomes a data point in a massive police surveillance system.
Palantir — which takes its name from the all-seeing stones in The Lord of the Rings — and the LAPD argue that these documents show how the system protects the public. It allows the police to quickly find criminals by plumbing a vast amount of personal information — plucking a suspected killer from a haystack.
But to critics, these documents show something vastly different. At great taxpayer expense, and without public oversight or regulation, Palantir helped the LAPD construct a vast database that indiscriminately lists the names, addresses, phone numbers, license plates, friendships, romances, and jobs of Angelenos — the guilty, the innocent, and those in between.
LAPD’s Palantir database includes information from the DMV, meaning people with a California driver’s license can be swept into Palantir. It also includes 1 billion pictures taken of license plates from traffic lights and toll booths in Los Angeles and neighboring areas. If you’ve driven through Los Angeles since 2015, the police can see where your car was photographed, when it was photographed, and then click on your name to learn all about you.
Palantir is no small thing within the LAPD. Almost 5,000 people, more than half of all LAPD officers, had accounts on Palantir, according to an “LAPD Palantir Usage Metrics” document. The same document says that in 2016, those officers ran 60,000 searches in support of more than 10,000 cases.
“[Palantir] is not actually improving things. It’s expanding the power that police have.”
Dozens of California police departments, sheriff’s offices, airport police, universities, and school districts signed onto data-sharing agreements with the LAPD between 2012 and 2017. Contractually, these entities had to send daily copies of their own police records (like warrants and arrests), license plate readings, and dispatch information so that the LAPD could put that data into Palantir. The Los Angeles School Police Department, Compton Unified School District Police Department, El Camino College, Cal Poly University Police Department, and California State University all signed these agreements.
For activists like Jamie Garcia, an organizer with the advocacy group Stop LAPD Spying Coalition, the LAPD’s well-established history of over-policing Black and brown communities means the department simply shouldn’t be trusted with this kind of technology. “The tool will only keep reflecting that racism,” she told BuzzFeed News.
Jacinta González, an organizer with Latinx advocacy group Mijente, said that Palantir gives police an opportunity to expand their power and reinforce the over-policing of Black and Brown communities.
“[Palantir] is not actually improving things,” she said. “It’s expanding the power that police have. And it’s minimizing the right that communities have to fight back, because many times, the surveillance is done in secretive ways. It’s ridiculous the community doesn’t know what Palantir is doing in their city, and we have to wait until you get FOIA documents to actually understand.”
The police and their critics do share one point of view, however: They both believe Palantir is working as intended.
Obtained by BuzzFeed News through a Freedom of Information Act request, the “Intermediate” and “Advanced” training guides comprise two eight-hour courses explaining to LAPD Crime Intelligence officers how to use Palantir on the job, revealing the granular level of detail the software brings to law enforcement searches. With Palantir, police can search for people by name. But, as you’ll see in the slides below, they can also search by race, gender, gang membership, tattoos, scars, friends, or family.
“Male, White, Peckerwood Gang, Skull Tattoo.”
“Person, Male, Hispanic, Vineland Boys, Rosary Tattoo.”
According to the training manuals, searches like these will return a list of names along with associated home addresses, email addresses, vehicles, warrants, and mugshots. If Palantir finds related surveillance pictures, it will offer those up as well, along with any personal connections it might find — including friends, family members, neighbors, and coworkers. LAPD officers can use this information as leads or to generate lists of people who the system believes are likely to commit a crime in the future.
Also included in the manuals is a walk-through of a suite of visualization functions that can plot on a map a vehicle’s encounters with law enforcement, or chart a gang’s turf using arrest locations, license plate data, and warrant addresses.
Another Palantir user guide, titled “Palantir Mission Control at LAPD,” was obtained by Hamid Khan, an organizer with Stop LAPD Spying. The May 2017 document walks officers through the process of importing crime data, visualizing it on a map, and creating crime map trend charts to analyze that data.
The LAPD told BuzzFeed News that the department doesn’t use this specific crime-mapping tool within Palantir anymore.
Palantir declined comment, citing the “quiet period” before its direct public listing.
Created in 2004, Palantir was designed to solve a problem: Information about the same people is spread across multiple databases — and even when it’s all compiled, it can be hard to interpret.
The company’s law enforcement software, Gotham, has also been used by the Northern California Regional Intelligence Center, which gave hundreds of police departments in the state access to Palantir’s powerful, expensive software. And big metropolitan police departments — like those in New Orleans, New York City, and Chicago — have used Palantir. The US Department of Health and Human Services has used its enterprise product, Palantir Foundry, to organize information about the coronavirus pandemic.
Palantir CEO Alexander Karp didn’t major in engineering or computer science. He received a PhD in philosophy from Goethe University in Frankfurt in 2002. In an analysis of his dissertation, Data and Society researcher and Berkman Klein Center for Internet & Society faculty associate Moira Weigel argues that Karp sees big data analytics as an example of “incontestable self-evidence” that treats correlations “simply as given.”
This can be seen in Palantir’s design. When the software maps relationships from person to person, person to car, person to home, or person to crime scene, those relationships are treated as fact and presented without caveat, though they are determined by an algorithm and not guaranteed to be accurate.
Sarah Brayne, a sociologist who embedded with the LAPD for two years to study its use of Palantir, told BuzzFeed News that when a system is designed this way, people using it can easily interpret results as evidence that someone may be or may become a criminal.
“If there’s somebody who the cops have been interested in 10 times throughout the course of your life, the idea is basically, where there’s smoke, there’s fire,” Brayne said. “There’s probably a reason that the cops keep being interested in this particular person.”
The LAPD started pursuing data-driven policing strategies as a way to respond to scandal.
Following the Rampart scandal in the late 1990s, which involved corruption, abuse of suspects, and evidence theft in the LAPD’s gang unit, the federal Department of Justice placed the LAPD under a consent decree, which forced the department to institute reforms and comply with federal audits.
So the department tried something new.
“The idea is basically, where there’s smoke, there’s fire.”
In 2002, James Hahn, then the mayor of Los Angeles, brought in William Bratton to lead the LAPD. Bratton came from the New York Police Department, where he introduced CompStat, a program that mapped crime. Among his tasks at the LAPD were to introduce new data-driven strategies and get the department out from under the consent decree.
Andrew Ferguson, American University law professor and author of The Rise of Big Data Policing, said that at the LAPD, data-driven policing doesn’t just mean using evidence to go after crimes. It means part of the police’s job is to conduct more surveillance in order to get more data.
“Palantir is a data-driven surveillance system more than a data-driven policing system,” Ferguson said, meaning that Palantir helps the police watch people, rather than keep people safe. And by design, it’s infinitely expandable. “It can be a platform for whatever sources of surveillance data they want to bring into the platform.”
The problem with data-driven policing, according to Ferguson, is that it doesn’t solve underlying problems within policing like concerns about misconduct or racism. Although data-driven policing sounds like a push to objectivity, it’s not.
The LAPD has used Palantir for more than a decade — first, as part of an initiative called the Los Angeles Strategic Extraction and Restoration (LASER), which ran from 2009 to 2019. Currently, Palantir is used in a program the department calls Data-Informed, Community-Focused Policing.
Palantir is expensive, and police departments often struggle to afford it. Between 2015 and 2016, the LA Mayor’s Office of Public Safety got the money to pay for Palantir from the federal government — specifically, from the Federal Emergency Management Agency’s Urban Areas Security Initiative (UASI) Program, which gives cities technology to “prevent, protect against, mitigate, respond to, and recover from acts of terrorism.”
It’s unclear if the LAPD always used federal UASI money to pay for Palantir, or if it only used federal money in those years. But purchase orders show that Palantir software upgrades and user licenses cost hundreds of thousands to millions of dollars.
Senator Ron Wyden (D-OR) said that it’s concerning that the LAPD has used federal money in order to finance Palantir.
“The federal government shouldn’t be spending money on unproven surveillance software or crime prediction programs that target Black and Hispanic Americans and don’t actually reduce crime,” he said. “The Justice Department has an obligation to make sure taxpayer money actually makes all of us safer while protecting the rights of every American.”
The LAPD also received federal money to design LASER, although that funding never went directly to Palantir. Under the Obama administration in 2009, the Bureau of Justice Assistance started the Smart Policing Initiative to provide grants for data-driven programs at police departments. The LAPD was one of the first departments to receive a Smart Policing Initiative grant, for creating a “place- and offender-based policing” strategy for gun violence. Using that money, the LAPD and contractor Justice and Security Strategies devised LASER.
LASER promised “precision policing.” Police would reduce crime by preemptively targeting the people and areas that presented the highest risk.
It involved two main pieces of software: PredPol and Palantir.
PredPol, a company that makes predictive policing software, was supposed to help the LAPD identify crime hot spots by looking at the time, weather, and location where a crime had been reported in the past. Academics have criticized the company for sending police to the same places where crime has already been reported, rather than identifying areas where crime is underreported. (The LAPD ended its use of PredPol in April, citing the cost.)
Palantir was also used to identify crime hot spots: police could load in crime reports, which were displayed on a timeline or a map showing which crimes had been reported where, and in what concentration.
For the LAPD, Palantir swallowed up the contents of several major databases that were previously spread across public safety agencies throughout Los Angeles and the state, including incident reports, arrests, citations, license plate reader data, field interviews, recovered vehicles, warrants, booking photos, data from the California Law Enforcement Telecommunications System, and the county’s Community Health Services data. Palantir algorithmically organizes this data, determining possible links between the people within it (e.g., “Sally was interviewed by police in a report about Fred”) and makes it searchable.
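To make that linking concrete, the logic amounts to treating every record that mentions two people as a connection between them. The sketch below is a toy illustration of that idea — the record structure and field names are entirely hypothetical, since Palantir’s actual schema isn’t public.

```python
from collections import defaultdict

def build_link_graph(records):
    """Map each person to the set of people they co-occur with in records.

    Each record that names two or more people (a field interview, an
    arrest report, etc.) contributes an edge between every pair named.
    """
    graph = defaultdict(set)
    for rec in records:
        people = rec["people"]
        for a in people:
            for b in people:
                if a != b:
                    graph[a].add(b)
    return graph

# Hypothetical records, echoing the article's "Sally was interviewed by
# police in a report about Fred" example.
reports = [
    {"type": "field_interview", "people": ["Sally", "Fred"]},
    {"type": "arrest", "people": ["Fred", "Jim"]},
]
graph = build_link_graph(reports)
# Fred is now linked to both Sally and Jim, even though Sally was never
# a suspect — she merely appeared in a report.
```

The point of the sketch is how indiscriminate the structure is: anyone named in any record becomes a node, and every co-occurrence becomes a searchable connection.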
“What’s going on in the current moment is evidence-based policing on steroids.”
Palantir also took in much of the data collected during the LAPD’s own policing activities. Field interviews include information that police collect from civilians, including family members, neighbors, lovers, friends, and coworkers. Their notes can also include descriptive information like race, height, weight, eye color, hair color, scars, nicknames, or tattoos, as well as interpersonal information, such as suspected gang affiliation. In 2014, five years into using Palantir, the LAPD paid $2.9 million to update its existing system to include “Thunderbird,” a software that organizes license plate data, using its Urban Areas Security Initiative Grant money from FEMA.
Palantir also analyzed data obtained from private sources. In one exercise in the Advanced Course training manual, police officers were asked to import telecom data from Verizon. (One file is called “Verizon — Phone Source Destination.”) The software automatically extracted the caller’s phone number, the recipient’s phone number, the date and duration of the call, and the latitude and longitude of all the cell towers used.
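The extraction itself is mundane: the same fields could be pulled from a call-record file with a few lines of code. The column names below are invented — the actual layout of the Verizon file isn’t public.

```python
import csv
import io

# Hypothetical call-record CSV with the fields the manual describes:
# caller, recipient, date, call duration, and cell tower coordinates.
sample = io.StringIO(
    "caller,recipient,date,duration_s,tower_lat,tower_lon\n"
    "3105550100,2135550199,2016-04-01,113,34.0522,-118.2437\n"
)

calls = list(csv.DictReader(sample))
for call in calls:
    print(call["caller"], "->", call["recipient"],
          "on", call["date"],
          "for", call["duration_s"] + "s",
          "via tower at", call["tower_lat"], call["tower_lon"])
```

What Palantir adds isn’t the parsing but the merge: once extracted, each phone number becomes another node linked into the same graph as arrests, interviews, and license plates.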
Brayne told BuzzFeed News that technologies like Palantir are part of a larger trend of data-driven policing. In an ideal world, data-driven methods would push police to rely on evidence. But in practice, she said, it often results in officers using the veneer of objectivity to back up their hunches.
“What’s going on in the current moment is evidence-based policing on steroids,” Brayne said. “It’s like, ‘Okay, now let’s rely on fancier stuff like AI or machine learning or predictive algorithms rather than just historical crime data to identify hot spots or heat density maps.’”
González said it’s concerning that Palantir, which was designed to be used for military intelligence in war zones, is being handed to local police departments using federal money.
“The federal government does this, knowing that there are not adequate legal protections for people’s information right now,” González said. “You’re seeing a whole parallel system of information gathering… being used to police communities and to criminalize communities.”
According to activist group Stop LAPD Spying, the most controversial aspect of LASER was its “Chronic Offender Bulletins.”
Every day between 2011 and 2019, police officers patrolling one of 40 “LASER Zones” — areas that police determined to have a high risk of crime — used Palantir to rank the top 12 “Chronic Offenders” thought most likely to commit a violent crime. (In 2017, police also had to choose five to ten “backup” people.)
The top 12 Chronic Offenders were determined using a point system. Officers were told to look at arrest reports, investigative reports, and field interview cards. Being on parole was worth five points. Gang membership earned five points. Each arrest over the past two years for a violent crime or for being in possession of a handgun was worth five points. “Quality police contacts” — which were not defined, but which Brayne said were probably “gesturing at legal thresholds around reasonable suspicion” — over the past two years were worth one point each.
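Put as code, the published formula is a simple tally. This is a minimal sketch of that arithmetic — the record fields are hypothetical stand-ins, not anything from Palantir or the LAPD’s actual files.

```python
def chronic_offender_score(record):
    """Tally points for one person, following the published formula:
    five points for parole status, five for gang membership, five per
    recent violent-crime or handgun arrest, one per 'quality contact'."""
    score = 0
    if record.get("on_parole"):
        score += 5
    if record.get("gang_member"):
        score += 5
    score += 5 * record.get("recent_violent_or_gun_arrests", 0)
    score += record.get("quality_police_contacts", 0)
    return score

def top_chronic_offenders(records, n=12):
    """Rank everyone by score and return the top n, as officers did daily."""
    return sorted(records, key=chronic_offender_score, reverse=True)[:n]
```

Note what the formula rewards: a person with no arrests at all can still accumulate points one “quality contact” at a time — which is how people with zero arrests ended up on the lists, as the inspector general later found.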
The LAPD told BuzzFeed News that it no longer creates Chronic Offender bulletins.
“Many folks who end up in the Palantir system are predominantly poor people of color.”
From the training guide, it’s unclear if these people were suspects, gave an interview to police, were associated with a suspect, or just happened to be in the area. In the training manuals, police aren’t taught to analyze how those names were extracted either. They simply take the names and move on with their work.
But according to critics like María Vélez, a criminologist at the University of Maryland, the LAPD’s predictive policing overtargeted Black and Latino Angelenos. In Los Angeles, 52% of the population is white, 49% is Latino, and 9% is Black. Those singled out by the Chronic Offender Bulletins were 53% Latino and 31% Black, according to an LAPD inspector general report.
The LAPD inspector general report argued that since the racial percentages in the Chronic Offender Bulletins roughly correspond to those of arrest rates, it wasn’t evidence of a problem. But Vélez said that the Chronic Offender Bulletins should be compared to the community demographics rather than those of arrests. “That’s disproportionate,” she said.
“The focus of a data-driven surveillance system is to put a lot of innocent people in the system,” Ferguson said. “And that means that many folks who end up in the Palantir system are predominantly poor people of color, and who have already been identified by the gaze of police.”
In the eyes of the LAPD inspector general, LASER’s flaws outweighed its benefits. In a report published in 2019, the office laid out its concerns.
Often, police didn’t have reliable information about who was in a gang and who wasn’t. (This year, California’s Office of the Attorney General found that LAPD officers falsely added dozens of people to the state’s gang database.)
The inspector general also found major inconsistencies in how police selected Chronic Offenders. Five police districts determined Chronic Offenders “based on verbal or informal referrals from field personnel,” the inspector general report says. Two reporting districts “did not use the point system at all.”
Out of the 637 Chronic Offenders identified by the LAPD, 44% were never arrested for a violent or gun-related crime, despite LASER’s attempt to target violent crime. And 10% of Chronic Offenders had zero contacts with police — meaning zero arrests and zero mentions in field interviews. Per LASER’s guidelines, 100% of Chronic Offenders should have already been arrested for a violent crime.
“The database included people who were in custody, who had been arrested for only non-violent crimes and whose points were either not entered or appeared to be over- or under-stated,” the report said.
People might or might not have found out that they were a Chronic Offender. Sometimes the police sent a warning letter, and other times police found the person and spoke to them directly. But there was no easy way for someone to appeal their status as a Chronic Offender or to remove themselves from surveillance. The only way for a person to remove their information from Palantir, according to the LAPD, is to get a court order.
The inspector general report also said that “one of the primary areas that lacked clarity was the overall goal of the [LASER] program itself.” Was the goal to “extract” violent people from the community? To deter crime? What were the marks of success? The report could not determine.
It also couldn’t find evidence that LASER reduced crime. The report said that in 6 out of the 13 LASER Zones, violent crime rates were “the same as, or worse than, those for non-LASER Zones.”
Although the department had argued that the program had reduced crime, citing a 23% drop in violent crime in the Newton reporting division in a federal grant application in 2016, the inspector general report and activist pushback were too much of a deterrent to continue the program.
And so, in April 2019, the LAPD ended the program. But it didn’t abandon Palantir.
Introduced this year, the Data-Informed, Community-Focused Policing program was supposed to be a more ethical iteration of LASER, but it works largely the same way: Police use Palantir to ingest large amounts of data, identify crime hot spots on a map, and create lists of people (largely based on parole data) who might commit a crime in the future.
Brayne said that even LAPD officers don’t know how Data-Informed, Community-Focused Policing is different from LASER. “I am tentatively viewing it as a rhetorical shift rather than a meaningful shift in practice,” she said.
There is one difference: The new plan formally expands the surveillance that began under LASER. Under LASER, the LAPD said people who were arrested for violent or gun-related crimes could be surveilled. Now, people suspected of nonviolent property crimes can also be.
The outline for the initiative also talks about preventing crime through “the physical maintenance or general upkeep of a place.” The idea that police should pursue property crime and property damage as a mode of crime prevention is known as “broken windows” policing, which critics have said pushes police to overreact to nonviolent crimes.
For people like Stop LAPD Spying’s Garcia, these police tactics are more of the same, even if they are packaged as new and scientific.
“All that stuff is very old. It’s not anything new,” Garcia said. “The only thing that’s fairly new about it, that they want to hide and bury, is the use of algorithms.” ●