THE KILLER ROBOT
Ask anyone on the street what "killer robots" are and you'd probably hear something out of the "Terminator" series. The fear of autonomous machines killing humans is so ingrained in society that some organizations have made it their mission to ban these terrifying creations.
But researchers are questioning whether banning killer robots is the right goal. According to a new study from the University at Buffalo (SUNY), the problem isn't killer robots themselves, but how society, particularly military organizations like the Pentagon, enables and researches violent machines.
"Instead of demonizing Killer Robots as such, we need to understand the tools, processes and operating procedures that create, support and validate these objects," the researchers said.
Co-author Tero Karppi, a professor of media study, argued that "we have to deconstruct the term 'killer robot' into smaller cultural techniques."
With roots in agricultural engineering, the concept of cultural techniques in media theory examines the actions, ideas, and technologies that give rise to certain concepts, objects, and systems.
DEEPER EXAMINATION
This means we shouldn't just look at what a killer robot is and what it could do, but also at how we're promoting its creation.
"We need to go back and look at the history of machine learning, pattern recognition and predictive modeling, and how these things are conceived," Karppi said, according to Phys.org.
In a world where robots are moving into the role of combatant, there will be situations where they need to distinguish between friend and foe, and in the real world that distinction risks being blurry.
That's partly why Karppi believes we need to examine how rules and ethics could translate into software code.
"The distinctions between combatant and non-combatant, human and machine, life and death are not drawn by a robot," Karppi said, according to Phys.org. "While it may be the robot that pulls the trigger, the actual operation of pulling is a consequence of a vast chain of operations, processes and calculations."
In other words, responsibility for the concerns surrounding artificial intelligence lies with developers, not with the technology itself.
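To make that point concrete, here is a minimal, hypothetical Python sketch, not drawn from the study and with all names invented for illustration, of how a "pull the trigger" decision in software sits at the end of a chain of human choices: which labels a model can output, how much confidence is required, and which targets are off-limits.

```python
# Hypothetical sketch (not from the study): a "fire" decision is the last link
# in a chain of human-authored choices -- the training data, the classifier,
# the confidence threshold, and the engagement rules are all set by developers.

from dataclasses import dataclass


@dataclass
class Detection:
    label: str         # e.g. "combatant" or "civilian", produced by a model humans trained
    confidence: float  # model confidence, computed over data humans selected


# Values chosen by developers and policymakers, not by the robot.
CONFIDENCE_THRESHOLD = 0.95
PROHIBITED_LABELS = {"civilian", "unknown"}


def authorize_engagement(detection: Detection) -> bool:
    """Return True only if every human-defined condition in the chain is met."""
    if detection.label in PROHIBITED_LABELS:
        return False
    if detection.confidence < CONFIDENCE_THRESHOLD:
        return False
    return True


# The robot "pulls the trigger", but each condition above encodes a human decision.
print(authorize_engagement(Detection(label="combatant", confidence=0.97)))  # True
print(authorize_engagement(Detection(label="unknown", confidence=0.99)))    # False
```

Even in this toy version, the machine never "draws the distinction" itself: the labels, thresholds, and prohibitions are all decisions made upstream by people.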