The Los Angeles Police Department was recently forced to release documents about their predictive policing and surveillance algorithms, thanks to a lawsuit from the Stop LAPD Spying Coalition (which turned the documents over to In Justice Today). And what do you think the documents have to say?

If you guessed "evidence that policing algorithms, which require officers to keep a checklist of (and keep an eye on) 12 people deemed most likely to commit a crime, are continuing to propagate a vicious cycle of disproportionately high arrests of black Angelenos, as well as other racial minorities," you guessed correctly.

Algorithms, no matter how sophisticated, are only as good as the information that's provided to them. So when you feed an AI data from a city with a demonstrable, mathematically provable problem of racist over-policing in neighborhoods with concentrations of people of color, and then have it tell you who the police should be monitoring, the result will only be as good as the process. And the process? Not so great!
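That feedback loop is easy to reproduce. Here's a toy simulation (every number in it is invented for illustration, not LAPD data) in which every neighborhood has the same true crime rate, but one starts out over-policed. Because the "algorithm" allocates next year's patrols based on recorded incidents, and recorded incidents track where officers already were, the initial imbalance never goes away:

```python
import random

random.seed(0)

TRUE_CRIME_RATE = 0.1                        # identical in every neighborhood
patrols = {"A": 10, "B": 1, "C": 1, "D": 1}  # A starts out over-policed
TOTAL_PATROLS = sum(patrols.values())

for year in range(5):
    # More patrols means more encounters, which means more *recorded*
    # incidents -- even though the underlying rate is the same everywhere.
    recorded = {
        hood: sum(random.random() < TRUE_CRIME_RATE for _ in range(count * 20))
        for hood, count in patrols.items()
    }
    # The "predictive" step: send next year's patrols where the data says
    # crime is -- which is wherever the patrols already were.
    total_recorded = sum(recorded.values()) or 1
    patrols = {
        hood: max(1, round(TOTAL_PATROLS * n / total_recorded))
        for hood, n in recorded.items()
    }
    print(f"year {year}: recorded={recorded} -> patrols={patrols}")
```

Run it and neighborhood A keeps soaking up nearly all the patrols, year after year, despite committing no more crime than anywhere else.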

The software assigns people points based on past offenses, or even just on having been stopped by (or otherwise in contact with) the police. According to the LAPD, those points indicate how likely someone is to commit a crime. Officers rely on the algorithm to determine who ought to be monitored, which can be done through probation or warrant checks: they're instructed to check up on flagged people, not to arrest them, but to see if there's any reason to do so. Those the algorithm targets as persons of interest can face extra police surveillance just because they may be associated with, or simply know of, criminal activity. That can happen even if they've never done anything wrong in their entire lives.
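The released documents don't spell out the exact formula, so here's only a minimal sketch of how a point system like the one described could work. Every weight, threshold, and field name below is a hypothetical stand-in, not the LAPD's actual math:

```python
from dataclasses import dataclass, field
from datetime import date


@dataclass
class PersonRecord:
    """A hypothetical record of someone's history with police."""
    past_offenses: int = 0                                     # prior arrests or convictions
    police_contacts: list[date] = field(default_factory=list)  # stops, field interviews, etc.


# Illustrative weights only -- the documents describe a point system,
# but these specific values are assumptions.
POINTS_PER_OFFENSE = 5
POINTS_PER_CONTACT = 1
MONITORING_THRESHOLD = 10


def risk_score(record: PersonRecord) -> int:
    """Add up points from offenses and from mere police contacts."""
    return (record.past_offenses * POINTS_PER_OFFENSE
            + len(record.police_contacts) * POINTS_PER_CONTACT)


def should_monitor(record: PersonRecord) -> bool:
    """Flag anyone whose score crosses the threshold."""
    return risk_score(record) >= MONITORING_THRESHOLD


# Note what this structure implies: someone with zero offenses, stopped
# often enough, still crosses the line into being monitored.
never_convicted = PersonRecord(police_contacts=[date(2018, 1, d) for d in range(1, 11)])
print(should_monitor(never_convicted))  # True -- ten stops, no offenses
```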

[And, thanks to social science, we know what happens when you systemically make people feel like criminals before they've even committed a crime! And it's not good.]

Also: Targeting of this nature, which the police have suggested is related to the ever-decreasing rate of homicide in LA, can add extra hurdles for people who may have been arrested for or convicted of a crime in the past but are just trying to get on with their lives. Because of how the LAPD algorithm operates, those people may remain persons of interest, subject to increased surveillance, for up to two years after their last contact with the police.

But those police contacts are often unavoidable: in one case, a person was stopped by police twice a day on four separate days over six weeks. Each of those interactions resets the clock.
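The upshot is a timer that never gets to run out. Here's a minimal sketch of that reset logic, assuming the two-year window described above (the names and the exact day count are our illustrative assumptions, not the LAPD's):

```python
from datetime import date, timedelta

# "Up to two years" comes from the reporting; 730 days is an assumption.
SURVEILLANCE_WINDOW = timedelta(days=730)


def still_a_person_of_interest(contacts: list[date], today: date) -> bool:
    """A person stays flagged until a full window passes with no police
    contact; every new contact restarts the clock."""
    if not contacts:
        return False
    return today - max(contacts) < SURVEILLANCE_WINDOW


# Stopped repeatedly over six weeks: the clock restarts each time, so the
# person remains flagged long after the stops end.
stops = [date(2017, 1, 2), date(2017, 1, 9), date(2017, 1, 30), date(2017, 2, 10)]
print(still_a_person_of_interest(stops, date(2018, 12, 1)))  # True
```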

There’s no indication that the LAPD deliberately set out to use an overtly discriminatory algorithm. But this program, which may have started as a glossed-up database of people associated with gangs, can't be expected to perform fairly when it's based on police activity that's definitely subject to bias and prejudice.

Put another way: The only ways to read this Twitter thread and not think "Minority Report" are to have never seen Minority Report, or to be the algorithm itself.

