Are the Three Laws of Robotics due for an update?
The Three Laws of Robotics, penned by sci-fi author Isaac Asimov in 1942, have shaped attitudes toward real-world robots to a striking extent:
- Robots must not harm humans.
- Robots must obey humans.
- Robots must protect themselves.
But nearly 80 years later, we're due for some new ones, argues AI law expert and Brooklyn Law School professor Frank Pasquale in a new book.
Pasquale lays out four new laws of robotics, which he told OneZero are designed to account for the ways we actually interact with robots in the year 2020 — like workplace automation.
Pasquale explains that his four new laws are meant to protect humanity from robots and AI in more realistic ways than Asimov envisioned. Think less "I, Robot" and more how AI could replace human workers or pit humanity against itself. With new laws, he argues, AI can better serve humanity instead of being put to whatever use Silicon Valley engineers decide on behind closed doors.
His suggestions are as follows, as summarized by OneZero:
- Digital technologies ought to "complement professionals, not replace them."
- AI and robotic systems "should not counterfeit humanity."
- AI should be prevented from intensifying "zero-sum arms races."
- Robotic and AI systems need to be forced to "indicate the identity of their creator(s), controller(s), and owner(s)."
Notably, these laws are directed at the people who develop or deploy AI rather than at the robots themselves, since those people are the ones with the power to actually hire and fire human workers.
"I think the big dream of a lot of folks in AI is presumably just letting it take on the job of a doctor, nurse, journalist, teacher, and so on," Pasquale told OneZero. "And my idea is that’s really not the goal we should be going for, right?"
"For me, the proper role of a lot of these technology fields is to complement and support professionals, not replace them," he added.
READ MORE: Asimov’s Three Laws Helped Shape A.I. and Robotics. We Need Four More. [OneZero]