Thursday, July 30, 2009

Three Human-Centric Laws for Robots


David Woods of Ohio State and Robin Murphy of Texas A&M have proposed a new version of the three laws for robots. The laws first appear in an article titled "Beyond Asimov: The Three Laws of Responsible Robotics," published in the July/August 2009 issue of IEEE Intelligent Systems. The new laws suggested by Murphy and Woods are:
A human may not deploy a robot without the human-robot work system meeting the highest legal and professional standards of safety and ethics.

A robot must respond to humans as appropriate for their roles.

A robot must be endowed with sufficient situated autonomy to protect its own existence as long as such protection provides smooth transfer of control which does not conflict with the First and Second Laws.


David Woods is quoted discussing the new laws in an article titled "Want Responsible Robotics? Start With Responsible Humans":

“Robots exist in an open world where you can’t predict everything that’s going to happen. The robot has to have some autonomy in order to act and react in a real situation. It needs to make decisions to protect itself, but it also needs to transfer control to humans when appropriate. You don’t want a robot to drive off a ledge, for instance -- unless a human needs the robot to drive off the ledge. When those situations happen, you need to have smooth transfer of control from the robot to the appropriate human,” Woods said.

“The bottom line is, robots need to be responsive and resilient. They have to be able to protect themselves and also smoothly transfer control to humans when necessary.”
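To make the handoff idea concrete, here is a minimal, purely illustrative Python sketch of the behavior Woods describes: the robot acts autonomously to protect itself, but yields control smoothly the moment a human asserts it. The class, method names, and decision logic are my own assumptions for illustration, not anything from the paper.

from enum import Enum, auto

class Controller(Enum):
    ROBOT = auto()
    HUMAN = auto()

class Robot:
    def __init__(self):
        self.controller = Controller.ROBOT

    def step(self, hazard_detected: bool, human_override: bool) -> Controller:
        """Decide who controls the robot this cycle.

        Third-law sketch: the robot may act to protect its own
        existence (e.g., stop before a ledge) only in a way that
        keeps a smooth handoff to a human available.
        """
        if human_override:
            # Second law takes precedence: defer to the human's role,
            # even if that means driving off the ledge.
            self.controller = Controller.HUMAN
        elif hazard_detected:
            # Situated autonomy: halt to protect itself, while the
            # override channel stays open for the human operator.
            self.stop()
            self.controller = Controller.ROBOT
        return self.controller

    def stop(self) -> None:
        print("robot: halting before hazard")

# Usage: the robot halts at a ledge, then yields when the operator
# asserts control (the "drive off the ledge anyway" case Woods describes).
r = Robot()
print(r.step(hazard_detected=True, human_override=False))  # Controller.ROBOT
print(r.step(hazard_detected=True, human_override=True))   # Controller.HUMAN

The point of the sketch is the ordering of the checks: self-protection never locks the human out, which is exactly the "smooth transfer of control" the third law requires.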

Woods admits that one thing is missing from the new laws: the romance of Asimov’s fiction -- the idea of a perfect, moral robot that sets engineers’ hearts fluttering.
