Rules of Engagement

Russia: Our Killer Robots Don’t Need Any Pesky International Laws

Needless to say, that's the minority view.
The Russian delegate to a UN conference argued that autonomous killer robots shouldn't be subject to any additional rules or regulations.

AI Unleashed

United Nations delegates are currently meeting to debate possible regulations controlling autonomous killer robots — but Russia is having none of it.

The Russian delegate, representing a country that has already developed and deployed military robots in real-world conflicts, remained steadfast that the global community doesn’t need any new rules or regulations to govern the use of killer robots, The Telegraph reports.

That pits Russia against much of the rest of the international community, which is calling for rules to keep humans in charge of the decision to open fire, highlighting the main anxieties and ethical conundrums surrounding autonomous weaponry.

Asleep at the Wheel

The argument from Russia is that the AI algorithms driving these killer robots are already advanced enough to differentiate friend from foe from civilian, and that therefore there’s no need to burden the autonomous death machines with unnecessary regulations.

“The high level of autonomy of these weapons allows [them] to operate within a dynamic conflict situation and in various environments while maintaining an appropriate level of selectivity and precision,” the delegate said, according to The Telegraph. “As a result, it ensures the compliance with [existing] rules of international humanitarian law.”

Needless to say, that was the minority view during the still-ongoing conference.

Open Fire

Russia certainly isn’t the only country developing autonomous military robots — the US, China, and the UK are all doing the same, to name a few. Still, the other nations seemed to rally against Russia in calling for more safeguards to make sure that a glitchy or poorly developed robot doesn’t fire at a civilian, or otherwise cause tragedy outside of its intended purpose.

“Humans must apply the rules of international humanitarian law in carrying out attacks, so weapons that function in this way complicate that,” International Committee of the Red Cross adviser Neil Davison said on BBC radio, according to The Telegraph. “Our view is that an algorithm shouldn’t decide who lives or dies.”

READ MORE: Killer robots need ‘no new rules’ about firing on humans, Russia tells UN [The Telegraph]

More on killer robots: Autonomous Killer Robot Accused of Attacking Soldiers

Dan Robitzki is a senior reporter for Futurism, where he likes to cover AI, tech ethics, and medicine. He spends his extra time fencing and streaming games from Los Angeles, California.