A government inquiry into federal agencies’ deployment of facial recognition may have overlooked some organizations’ use of popular biometric identification software Clearview AI, calling into question whether authorities can understand the extent to which the emerging technology has been used by taxpayer-funded entities.
In a 92-page report published by the Government Accountability Office on Tuesday, five agencies — the US Capitol Police, the US Probation Office, the Pentagon Force Protection Agency, the Transportation Security Administration, and the Criminal Investigation Division at the Internal Revenue Service — said they didn’t use Clearview AI between April 2018 and March 2020. This, however, contradicts internal Clearview data previously reviewed by BuzzFeed News.
In April, BuzzFeed News revealed that those five agencies were among more than 1,800 US taxpayer-funded entities that had employees who tried or used Clearview AI, based on internal company data. As part of that story, BuzzFeed News published a searchable table disclosing all the federal, state, and city government organizations whose employees are listed in the data as having used the facial recognition software as of February 2020.
While the GAO was tasked with “review[ing] federal law enforcement use of facial recognition technology,” the discrepancies between the report, which was based on survey responses, and BuzzFeed News’ past reporting suggest that even the US government may not be equipped to track how its own agencies access surveillance tools like Clearview.
Os Keyes, a PhD candidate at the University of Washington who has researched the politics of AI systems, said the discrepancy between the GAO report and BuzzFeed News’ reporting “highlights the limits of the GAO, and who has power here.”
“I think it speaks to the fact that the GAO analysis … is ultimately playing catchup, and in a domain where … people are not documenting the technologies they use, the regulations they put around them, or the processes for accessing them,” they said.
Clearview AI — a relatively new facial recognition tool whose database was built upon an estimated 3 billion photos scraped from websites and social media platforms — has spent the last two years quietly distributing its software to individual employees within police departments, prosecutors’ offices, health departments, universities, and other organizations by offering free trials with unlimited searches. The strategy has been successful, enabling the company to forge relationships with taxpayer-funded agencies — sometimes without permission or oversight from their leadership — in hopes of converting those relationships into paid contracts.
In the spring, BuzzFeed News reached out to 1,803 US taxpayer-funded entities — including the US Capitol Police, the IRS, various US Probation offices, the PFPA, and TSA — to inquire about whether their employees used Clearview AI. In the GAO report, those five organizations admitted to using some type of facial recognition software in their work, but not Clearview AI.
The GAO report surveyed 42 federal agencies in total, 20 of which reported that they either owned their own facial recognition system or used one developed by a third party between April 2018 and March 2020. Ten federal agencies — including Immigration and Customs Enforcement and Customs and Border Protection — said they specifically used Clearview AI.
In April, when asked about their use of Clearview, both the Pentagon Force Protection Agency (1–5 searches) and the IRS Criminal Investigation Division (101–500 searches) acknowledged that some of their employees may have used or tested Clearview. “We are aware that the data you describe shows a small number of instances where persons within or related to IRS-CI have used [Clearview AI] in a trial manner,” IRS spokesperson Justin Cole said in April.
It’s unclear why the IRS and PFPA responded differently to the GAO survey. Spokespeople for the IRS and PFPA did not respond to email requests for comment.
Reached for comment, a GAO spokesperson did not explain the discrepancy between BuzzFeed News’ reporting and the office’s findings, but noted that agencies self-reported their use of facial recognition tools.
“Selected agencies self-reported about their use of [facial recognition technology] and had the opportunity to update or confirm the information at a number of points prior to report publication,” a spokesperson said. “Our report presents each agency’s self-reported information acquired during our review.”
When reached by BuzzFeed News in the spring, the US Capitol Police (51–100 searches); the TSA (101–500 searches); and the US Probation Offices for the District of Utah (11–50 searches), the Eastern District of New York (11–50 searches), and the Western District of North Carolina (11–50 searches) did not respond to requests for comment. The US Probation Office for the Middle District of Florida said it was not aware of any of its workers using the software, though it’s listed in the data as having run more than 51 Clearview searches.
While the GAO survey about Clearview AI use focused on the period between April 2018 and March 2020, the report did note that the US Capitol Police used Clearview following the US Capitol insurrection on Jan. 6, 2021, “to help generate investigative leads.”
The GAO report also says that six federally funded agencies used some type of facial recognition technology on Black Lives Matter protesters. All six of those agencies — the Bureau of Alcohol, Tobacco, Firearms, and Explosives, US Capitol Police, FBI, US Marshals Service, US Park Police, and US Postal Inspection Service — have used Clearview AI, per reporting and data reviewed by BuzzFeed News. In comparison, just three federally funded agencies said they used facial recognition technology to find suspected Capitol insurrectionists.
The GAO report says that information about “the extent” to which agencies used Clearview was omitted due to its sensitive nature. But excluding details about how exactly Clearview was used wouldn’t affect the yes-or-no answers agencies gave about whether they had used it at all.
“We should be clear, this is not a full audit and this is not a true independent review,” Albert Fox Cahn, the executive director of advocacy group Surveillance Technology Oversight Project, said about the GAO report. “With the caveats that these agencies are giving, they have a lot of flexibility to hide what’s really going on.”
The GAO found that many federal agencies that use third-party facial recognition technology don’t have a way of tracking or auditing how their employees may be using the tools. Of the 14 agencies that said they used nonfederal facial recognition tools, 13 — including the US Capitol Police and the IRS — don’t have a mechanism to audit which third-party facial recognition systems their workers are using. Only one, ICE, said it had a tracking mechanism.
Some of the agencies that said they don’t have a way of tracking third-party facial recognition software have also used Clearview, according to reporting and data reviewed by BuzzFeed News and their responses to the GAO survey. These agencies include CBP, the FBI, the Drug Enforcement Administration, and the US Marshals Service.
As part of its report, the GAO made 26 recommendations to various federal agencies that included requests to implement mechanisms “to track what non-federal systems with facial recognition technology are used by employees.”
Kate Ruane, a senior legislative counsel for the American Civil Liberties Union, told BuzzFeed News that companies like Clearview highlight the need to end government use of facial recognition.
“Biden and Congress need to take immediate action to identify how and where dangerous face recognition is being used by the government and halt its use,” she said.