Security Companies Want To Use Facial Recognition To Stop School Shootings
More surveillance should never be the answer, especially in schools.
It’s a heartbreaking reality affecting more than 50 million American students: senseless acts of violence at schools are more than a weekly occurrence.
To help allay people’s fears, private security companies are marketing unproven technologies to schools with the intention of preventing school shootings.
The most recent example, detailed by the Washington Post: cameras equipped with facial recognition software, positioned inside and outside of schools. Tech companies emphasize that this system would keep a closer eye than ever on children, flagging those who act in ways that might suggest they pose a danger to others.
It may be intrusive, but using facial recognition tech without consent is perfectly legal. As the Washington Post notes, only Illinois and Texas have laws requiring companies to get consent before scanning faces using facial recognition.
Many schools across the nation have already invested millions of dollars in the technology, equipping their hallways and classrooms with high-tech camera systems, or upgrading their existing closed-circuit cameras with AI software.
Unfortunately, there are plenty of legitimate concerns over the effectiveness of current day facial recognition technology. Just a few:
- Many security cameras lack the video quality to give AI software a picture clear enough to distinguish between faces and objects, for example.
- Because technology’s rapid advancements are far outpacing the law, security companies essentially have free rein over data collection methods, and over how and where that data is stored.
- Predicting the future face shapes and structures of still-growing children requires highly complex calculations that the software may not be capable of.
- Facial recognition software has been repeatedly shown to be less accurate for people of color. In a school setting, that might mean that students of color are more heavily disciplined than their white classmates.
- Even in the case of a true positive, the system may not alert people quickly enough to intervene in time to prevent another violent attack.
The U.S. isn’t the only place where facial recognition monitors students. A high school in China — a country where sophisticated face scanners are being used to catch and identify jaywalkers — is using facial recognition to replace ID cards, monitor students’ whereabouts, and survey their behavior and emotional states. A business school in Paris is planning to use facial analysis to ensure that students are paying attention in class.
Let’s face it, facial recognition and more surveillance for students isn’t the best way to prevent violent attacks. At best, the system is merely intrusive; at worst, it could become a tool of abuse, amplifying injustices already present in schools.
Some people are trying to do better. In the months since the deadly school shooting in Parkland, Florida, many people have suggested new ways American institutions could begin to address the problem, like offering more mental health services, regulating the sale of firearms, or, you know, studying the problem. More surveillance is not a replacement for these important steps. And, luckily, greater awareness means we might be moving closer to accomplishing them.