It's basically unusable — and potentially dangerous.

Saw It Coming

The UK government has been funneling millions of dollars into an artificial intelligence tool meant to predict violent crime. Now, officials are finally ready to admit that it has one big flaw: It's completely unusable. It's also, predictably, riddled with ethical problems, as Wired reports.

Police have already stopped developing the system, called "Most Serious Violence" (MSV) and part of the UK's National Data Analytics Solution (NDAS) project. Luckily, it was never actually put to use, yet plenty of questions about the system remain.

The tool worked by assigning people scores based on how likely they were to commit a gun or knife crime within the next two years.

The system was trained on two databases of crime and custody records from two different UK police forces.
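To make that concrete, here is a minimal sketch of the general kind of risk-scoring model described above: a binary classifier over tabular record features that outputs a probability of offending within the two-year window. The features, data, and model choice here are hypothetical; the actual MSV pipeline and its training data have not been made public.

```python
# A minimal, hypothetical sketch of a two-year risk-scoring classifier.
# Nothing here reflects the actual MSV system; it only illustrates the
# general approach of scoring individuals from record-derived features.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical stand-ins for features derived from crime and custody
# records (e.g., prior offence counts, days since last custody event).
X_train = rng.random((1000, 4))
# 1 = committed a serious violent offence within two years, 0 = did not.
y_train = rng.integers(0, 2, size=1000)

model = LogisticRegression().fit(X_train, y_train)

# The "score" is the model's estimated probability that a person
# offends within the two-year window.
risk_scores = model.predict_proba(rng.random((5, 4)))[:, 1]
print(risk_scores)
```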

Fatal Flaws

Fortunately, the system never got off the ground: it was full of flaws, according to official documents obtained by Wired. "A coding error was found in the definition of the training data set which has rendered the current problem statement of MSV unviable," a March briefing read.

"It has proven unfeasible with data currently available to identify a point of intervention before a person commits their first MSV offense with a gun or knife with any degree of precision."

Instead of precision scores in the high 70s, as early tests had suggested, the corrected system was able to predict gun or knife violence less than 20 percent of the time.
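To see what that drop means in practice, here is a short sketch of how precision is computed. The counts below are illustrative only, chosen to match the rough figures reported; they are not the actual NDAS evaluation numbers.

```python
# Precision = true positives / (true positives + false positives):
# of everyone the model flags as high-risk, what fraction actually
# went on to offend? The counts below are illustrative only.
def precision(true_positives: int, false_positives: int) -> float:
    return true_positives / (true_positives + false_positives)

# A model that flags 100 people, 75 of whom offend, scores 0.75,
# roughly the "high 70s" figure from early tests.
print(precision(75, 25))  # 0.75

# After the coding error was fixed, fewer than 20 of every 100
# flagged people actually went on to offend.
print(precision(19, 81))  # 0.19
```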

Biases

Police forces have since decided to discontinue development of the system. Yet, as experts tell Wired, even a tool with 100 percent accuracy would leave plenty of biases in similar systems in the future, especially with regard to age and ethnicity.

READ MORE: A British AI Tool to Predict Violent Crime Is Too Flawed to Use [Wired]

More on AI: Scientists Are Using an “AI Bird Watcher” to Solve a Solar Farm Bird Massacre Mystery

