Slaughterbots

A new short film illustrating the prospect of autonomous military drones has been commissioned for an event hosted by the Campaign to Stop Killer Robots at the United Nations Convention on Conventional Weapons.

The film presents a fictionalized scenario in which a tech company showcases and deploys its latest combat drone, which is capable of distinguishing the good guys from the bad guys. A montage of mock news reports illustrates what happens next, when the device's true capabilities are revealed and the machines begin killing off politicians and activists.

Stuart Russell, an artificial intelligence (AI) researcher at the University of California, Berkeley, is part of the group that will show the film to attendees. He has stated that the technology depicted in the film already exists and that it would actually be much easier to implement than self-driving vehicles.

Military drones are nothing new, having long been used for reconnaissance missions as well as attacks. However, they have largely been operated by human pilots via remote control, whereas we're now in a position to outfit these machines with automated targeting systems, allowing them to execute missions autonomously.

This situation is troubling enough in its own right, but there are also concerns about the potential for widespread proliferation. These drones could be manufactured en masse for a relatively small amount of money – and they could be used to enact the unthinkable if they were to fall into the wrong hands.

Ban These Bots

The Campaign to Stop Killer Robots hopes to convince international authorities to establish a treaty that bans autonomous weapons. This would outlaw the large-scale manufacture of such machines, and apply oversight to any nation choosing to explore the technology.

“Pursuing the development of lethal autonomous weapons would drastically reduce international, national, local, and personal security,” argued Russell, according to a report from The Guardian. This line of thinking has been compared to the approach that prompted the Biological Weapons Convention.

As the underlying technology that facilitates this kind of weaponry has progressed, experts have realized the need to appeal to lawmakers. And, while calls for legislation have been made for years, there has been a serious increase in activity over the course of 2017.

In August, Elon Musk led a host of prominent AI experts in signing an open letter that outlined the dangers of autonomous weapons. In November, scores of experts sent open letters to the prime ministers of Australia and Canada, urging them to take action.

This technology is very real – and if we wait too long to regulate, it might be impossible to close Pandora's box.
