Police departments across the country are increasingly relying on systems that use acoustic sensors and artificial intelligence to detect gunshots in certain neighborhoods.

In fact, according to a shocking new investigative report by the Associated Press, 65-year-old Michael Williams was falsely accused of murder last August, and spent a whole year in jail — specifically because of one of these crime-fighting AIs.

The evidence against him? Dubious data from ShotSpotter, a gunshot detection system that's incredibly expensive to implement, especially considering just how fundamentally flawed it is, according to the AP's report. So flawed, in fact, that after Williams had spent that year behind bars, the prosecution dropped its case due to (you guessed it) a lack of evidence, and he was freed.

How faulty is it? ShotSpotter can miss live gunfire right under its microphones, and it often misclassifies fireworks or cars backfiring as gunshots. Those flaws can easily lead to false accusations and, as in Williams' case, arrests.

Even worse, ShotSpotter employees have often opted to change the reported location of gunshot sounds after the fact, introducing human bias into the system. The sensors are also disproportionately installed in Black and Latino communities.

The report also found that technology like ShotSpotter hasn't addressed these issues, and that there's little evidence it reduces gun violence at all. "The evidence that we’ve produced suggests that the technology does not reduce firearm violence in the long-term, and the implementation of the technology does not lead to increased murder or weapons related arrests," Mitch Doucette, lead author of a recently published study that looked at gun violence in 68 metropolitan counties between 1999 and 2016, told the AP.

ShotSpotter also actively masks the way its underlying technology functions, and doesn't allow investigators and lawyers to interrogate its inner workings.

"We have a constitutional right to confront all witnesses and evidence against us, but in this case the ShotSpotter system is the accuser, and there is no way to determine if it’s accurate, monitored, calibrated or if someone’s added something," Katie Higgins, a defense attorney, told the AP. "The most serious consequence is being convicted of a crime you didn’t commit using this as evidence."

The technology has also never been vetted or peer-reviewed by experts, something that fundamentally undermines its validity and reliability.

ShotSpotter argues that it is constantly fine-tuning its system, and that its alerts are backed up by patrol officers who investigate potential crime scenes.

But the simple fact that the company's employees have been found manually changing the reported locations of gunshots raises plenty of red flags.

It's a troubling example of police departments over-relying on inherently flawed technologies that not only cost them (and taxpayers) millions of dollars, but can also lead to routine miscarriages of justice. The dream of crime-fighting AI is a great one, but as it stands, the reality remains nightmarish for those who end up in its crosshairs.

READ MORE: How AI-powered tech landed man in jail with scant evidence [Associated Press]

More on policing: Grandfather Dies of Heart Attack After Minors SWAT His Home

