On Tuesday, New York Gov. Andrew Cuomo announced a moratorium on the use of facial recognition and other biometric technology in the state’s public and private schools until at least July 2022. Lockport City School District, one of the few districts in the country to use facial recognition in its schools, will comply with the moratorium, the Lockport Journal reported.
That doesn’t appear to have been the first choice of administrators in the western New York district. Documents obtained by BuzzFeed News via a public records request show that they argued in a slide presentation that when it comes to facial recognition, “history is on our side.”
It’s unclear who gave the presentation, where, or when. But it has a clear pro-surveillance message.
“We may have to have thick skin to continue to take the bashing of misinformation and being called irresponsible and incapable of making a good decision regarding technology we could not possibly understand, but history shows we are in good company,” a slide presentation prepared by the Lockport City School District says. “It is very easy to say something is not good, dangerous, or irresponsible when we do nothing to actually find out the truth.”
In the wake of the 2018 shooting at Marjory Stoneman Douglas High School, Lockport announced it would use Aegis, a facial recognition system developed by biometric technology company SNTech. With the ever-present threat of shootings at in-person schools, some parents and teachers have argued that technology is the only way to protect students — whether it’s software that surveils every finger stroke a child types or schools built like fortresses.
For its part, the district, which serves just under 4,500 students near Niagara Falls, became one of the first in the country to use facial recognition. Aegis was adopted as part of a $3.8 million security enhancement project covering six elementary schools and three intermediate and high schools. The money also went toward the installation of security cameras, panic buttons, and door control technology.
A school district spokesperson told BuzzFeed News that even though the district would comply with the new law, it would prefer to keep using facial recognition. “The district continues to believe that its students, staff, and visitors should not be deprived of the additional layer of security provided by the district’s facial recognition system,” the spokesperson said. “Nonetheless, the district will, of course, comply with applicable law.”
In the slide presentation, titled “Aegis Security: Opportunity for Discussion,” the district addresses concerns about facial recognition. For instance, it claims that Aegis is accurate across various categories of gender and race. This was likely meant to address concerns that facial recognition has proven to be less accurate when used to identify women or people of color. The presentation uses Aegis’s accurate identifications of actors Lupita Nyong’o and the late Chadwick Boseman as examples.
“The system uses biometric measurements and neuroscience to identify shapes and patterns,” the slide presentation says. “It does not see male, female, black, white, hispanic, etc., numbers and mathematical equations determine a match.”
Matthew Guariglia, a policy analyst for the Electronic Frontier Foundation, told BuzzFeed News that this argument falls short.
“This is a fig leaf that we hear all the time whenever we have a technology that disproportionately impacts vulnerable communities, which is that it’s not biased. It’s just math. How can math be biased?” he said. “But the truth is that this technology is designed by people and trained with databases or by people.”
In one of two slides titled “History Is On Our Side,” school administrators compare concerns about facial recognition to fear and mistrust of telephones, writing, books, radio, cinema, and computers. “To dismiss what we do not understand without proper evaluation because of fear, suspicion, laziness, or an ulterior motive, has proven to be folly throughout history.”
Guariglia was also critical of this argument. “History is just as full of technologies that were used, decided that they were harmful or invasive, and jettisoned,” he said. “You’re more likely to put facial recognition in line with the use of lead paint than the use of the radio.”
In the slide presentation, school administrators say that Aegis “scans 4 million images a second,” “identifies guns and faces,” and “requires confirmation by a human monitor for every security alert.” Using object recognition, the system is said to detect “the most common firearms” and alert school administrators. If a human monitor confirms the presence of a gun, the system automatically alerts the police and puts the school into lockdown.
The facial recognition works by matching security camera footage to a database kept by the district, which includes sex offenders, “unauthorized staff,” suspended students, and other threatening people.
The presentation says that Aegis does not “track students, staff, or visitors,” store information about individuals “who are not identified as a threat,” detect “concealed” weapons, or share information with third-party vendors.
A separate privacy document says images would not be stored for longer than 60 days unless they are part of an investigation.
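Taken together, the district’s descriptions imply a fairly simple alert pipeline: camera frames are matched against a district-maintained watchlist or scanned for guns, a human monitor must confirm every alert, a confirmed firearm detection triggers police notification and a lockdown, and images of people not identified as a threat are discarded rather than retained. The sketch below, in Python, is purely illustrative of that described flow; the names, fields, and structure are assumptions based only on the slide presentation, not SNTech’s or the district’s actual software.

```python
# Hypothetical sketch of the alert workflow the district describes. This is
# not SNTech's actual code or API; every name, field, and structure here is
# an illustrative assumption based only on the slide presentation.

from dataclasses import dataclass
from datetime import datetime, timedelta

RETENTION_DAYS = 60  # retention window cited in the district's privacy document


@dataclass
class Detection:
    kind: str                     # "face" or "gun"
    image_id: str
    timestamp: datetime
    watchlist_hit: bool = False   # face matched the district-maintained database


def handle_detection(det: Detection, human_confirmed: bool) -> list[str]:
    """Return the actions implied by the district's description of Aegis."""
    actions = []

    if det.kind == "gun" and human_confirmed:
        # A confirmed firearm detection alerts police and locks down the school.
        actions += ["notify_police", "initiate_lockdown"]
    elif det.kind == "face" and det.watchlist_hit and human_confirmed:
        # Faces are flagged only when they match the watchlist of sex offenders,
        # suspended students, "unauthorized staff," and other listed people.
        actions.append("alert_administrators")

    if actions:
        # Images tied to a confirmed threat are kept, but only for 60 days
        # unless they become part of an investigation.
        expires = det.timestamp + timedelta(days=RETENTION_DAYS)
        actions.append(f"retain:{det.image_id}:until:{expires:%Y-%m-%d}")
    else:
        # Individuals not identified as a threat are not stored.
        actions.append(f"discard:{det.image_id}")

    return actions


if __name__ == "__main__":
    det = Detection(kind="gun", image_id="cam12-0001",
                    timestamp=datetime(2020, 12, 22, 10, 30))
    print(handle_detection(det, human_confirmed=True))
    # ['notify_police', 'initiate_lockdown', 'retain:cam12-0001:until:2021-02-20']
```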
BuzzFeed News also obtained a separate slide presentation prepared by SNTech dated August 2019. It’s unclear who specifically created the presentation or to whom at the Lockport City School District it was presented. The presentation says, “We insist that our system is on all cameras,” and that it checks images at a rate of 13 frames per second. When the system does make a match, it does so correctly 99.96% of the time, per the National Institute of Standards and Technology’s facial recognition accuracy test. NIST did not immediately respond to BuzzFeed News’ request for comment.
The state moratorium on facial recognition requires a study about the effectiveness and accuracy of facial recognition. Guariglia noted that any facial recognition system is only as good as the people who use it, and the data it’s trained on. “This technology can only be as unbiased as the system in which it operates,” he said. “And the punitive system of America is racially biased.”