It's not too late.

Rise of the Slaughterbots

Killer drones — or "slaughterbots" — are already conducting airstrikes without any humans involved in the decision making, according to a recent UN report. Again, not the piloting, the decision making. Computers are deciding who to drone strike.

And that should have us really worried, a group of researchers argue in a guest post for IEEE Spectrum. "In so many words, the red line of autonomous targeting of humans has now been crossed," the team writes.

According to them, the use of lethal autonomous weapon systems should be halted immediately, and nations around the world should sign a treaty to ensure these killer robots are never used again.

To drive the message home, the Future of Life Institute, a non-profit focused on educating the world about the risks of AI and nuclear weapons, put together a video, released back in 2017, that was co-signed by the authors of the IEEE Spectrum post.

"Beyond the moral issue of handing over decisions over life and death to algorithms, the video pointed out that autonomous weapons will, inevitably, turn into weapons of mass destruction, precisely because they require no human supervision and can therefore be deployed in vast numbers," wrote the team, which is made up of computer science and physics professors.

There's been some movement from the international community to push for an end to autonomous weapon systems, including a statement from the International Committee of the Red Cross calling for "a prohibition on autonomous weapon systems that are designed or used to apply force against persons."

The UN report, publicly released back in March, outlined the use of Turkish-made STM Kargu-2 drones that conducted airstrikes in Libya without any human intervention. The 14-pound drone can be mass-produced and can autonomously target victims using facial recognition software.

According to the report, drone systems like it "were programmed to attack targets without requiring data connectivity between the operator and the munition."

It's not too late, the researchers argue, to come together as an international community and put an end to this extremely dark turn.

"We want nothing more than for our 'Slaughterbots' video to become merely a historical reminder of a horrendous path not taken," the team wrote in their post. "A mistake the human race could have made, but didn’t."

READ MORE: Lethal Autonomous Weapons Exist; They Must Be Banned [IEEE Spectrum]

More on killer robots: Autonomous Killer Robot Accused of Attacking Soldiers

