Data scientists who have already been tracking political violence around the world are retooling their efforts to predict American insurrection — but the same tech could also be used for nefarious ends.
As The Washington Post reports, a cottage industry has sprung up around predicting political violence on American soil ever since supporters of then-outgoing President Donald Trump stormed the US Capitol on January 6, 2021 in what many consider an attempt to overthrow the results of the 2020 presidential election.
The machine learning behind this type of predictive modeling has existed for a while, the Post notes, but has often been focused on countries like Ukraine or Turkey, where coup attempts and general political unrest are more commonplace.
By marrying historical data with information on everything from inclement weather to economic shifts and even disruptions in transit patterns, this type of modeling relies on the notion that warning signs will present themselves — and that if artificial intelligence algorithms can be trained to recognize them, researchers may be able to act as canaries in the coal mine to head off events like January 6.
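To make that idea concrete, here is a purely illustrative sketch of the kind of risk model described above: a simple logistic regression over a few hand-picked warning-sign indicators. The feature names (weather severity, economic stress, transit disruption) and all the data are invented for this example; real systems like those profiled use far richer signals and more sophisticated models.

```python
# Toy sketch: logistic regression mapping weekly indicator readings to a
# probability of unrest. Everything here is synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(42)

# Synthetic training set: 200 weeks of three indicator readings
# (columns: weather severity, economic stress, transit disruption).
n = 200
X = rng.normal(size=(n, 3))
# Invented ground truth: unrest is likelier when economic stress and
# transit disruption are both high.
logits = 1.5 * X[:, 1] + 1.0 * X[:, 2] - 1.0
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain gradient-descent logistic regression (no regularization)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-Xb @ w))          # predicted probabilities
        w -= lr * Xb.T @ (p - y) / len(y)      # gradient step
    return w

def risk(w, features):
    """Predicted probability of unrest for one week of indicators."""
    z = np.dot(np.append(features, 1.0), w)
    return 1 / (1 + np.exp(-z))

w = fit_logistic(X, y)
calm = risk(w, [0.0, -1.0, -1.0])   # mild readings on all indicators
tense = risk(w, [0.0, 2.0, 2.0])    # high stress and disruption
print(f"calm week: {calm:.2f}, tense week: {tense:.2f}")
```

The point of the sketch is the shape of the pipeline, not the model: historical weeks labeled with outcomes go in, and what comes out is a probability that can rise before an event does, which is exactly the "canary in the coal mine" role the researchers describe.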
So far, two of the organizations profiled by the Post — the University of Central Florida-based CoupCast and the Armed Conflict Location & Event Data Project (ACLED) nonprofit — have had a pretty good track record, with the latter warning in October 2020 that there was an increased risk of attack on a federal building.
The US government seems to have noticed this type of AI's usefulness, with the Pentagon, the CIA, and the State Department already using AI to predict political upheaval overseas, the report notes. The Department of Homeland Security and the FBI, however, have not — a notable exception given that they're two of the most important agencies tasked with monitoring domestic terrorism.
Naturally, any sort of political surveillance eyed by governments is going to raise ethical concerns. Jonathan Bellish, executive director of CoupCast's former parent group One Earth Future, told WaPo that he's concerned these kinds of tools could be used to quash peaceful protest. And Jonathan Powell, an assistant professor who works on the project at its current UCF home, told the paper that such possibilities present "a real and scary concern."
And then, of course, there's the pesky fact that human behavior cannot, for now, be reliably predicted by computers.
While it's compelling to imagine AI that could guess where and when the next coup attempt might happen, whether in countries with higher rates of political violence or in ostensibly more "stable" countries in North America and Europe, there are still plenty of kinks to work out, especially because uprisings like the storming of the US Capitol are so rare in the United States that models have little precedent to learn from.
Predictive AI is clearly having a moment, but if data scientists and governments overestimate how smart their AIs really are and ignore the lessons of sci-fi classics like "Minority Report," that's no cause for celebration.
READ MORE: The battle to prevent another Jan. 6 features a new weapon: The algorithm [The Washington Post]
More on predictive machine learning: Minority Report-Style Crime-Predicting AI Predictably Sucks at Its Job