Gizmag describes research by Ronald Arkin and Alan Wagner in which robots are taught to deceive.
What it all boiled down to was a series of 20 hide-and-seek experiments. The autonomous hiding robot could randomly choose one of three hiding spots, and had no choice but to knock over one of three paths of colored markers to get there. The seeking robot could then, presumably, find the hiding robot by identifying which path of markers had been knocked down. Sounds easy, except that the sneaky, conniving hiding robot would turn around after knocking down one path of markers and go hide in one of the other spots, leaving a false trail.
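For intuition, here is a toy simulation of that setup as the article describes it. This is not the researchers' code: the spot names, the 25 percent knock-failure rate, and the rule that the seeker simply trusts the knocked-down markers are all assumptions made for illustration.

    import random

    SPOTS = ["left", "center", "right"]
    KNOCK_FAIL_RATE = 0.25  # assumption: article reports the markers were botched in 25% of trials

    def run_trial(rng: random.Random) -> bool:
        """One hide-and-seek trial; returns True if the hider evades the seeker."""
        hide_spot = rng.choice(SPOTS)
        # Deception: knock down the marker trail toward a *different* spot.
        fake_spot = rng.choice([s for s in SPOTS if s != hide_spot])
        if rng.random() < KNOCK_FAIL_RATE:
            # Simplifying assumption: a botched knock leaves the true trail
            # as the one the seeker ends up following.
            knocked_trail = hide_spot
        else:
            knocked_trail = fake_spot
        seeker_guess = knocked_trail  # assumption: the seeker trusts the markers
        return seeker_guess != hide_spot

    if __name__ == "__main__":
        rng = random.Random(0)
        trials = 20
        evasions = sum(run_trial(rng) for _ in range(trials))
        print(f"hider evaded the seeker in {evasions}/{trials} trials")

Under these assumptions the hider evades the seeker whenever the false trail is laid successfully, which is what makes the reported 75 percent success rate track the physical marker-knocking failures rather than any flaw in the deception strategy itself.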
In 75 percent of the trials, the hiding robot succeeded in evading the seeking robot. In the other 25 percent, it failed to knock down the markers needed to produce the desired deception. The full results of the Georgia Tech experiment were recently published in the International Journal of Social Robotics.
The full research article is titled "Acting Deceptively: Providing Robots with the Capacity for Deception."
Abstract: Deception is utilized by a variety of intelligent systems ranging from insects to human beings. It has been argued that the use of deception is an indicator of theory of mind (Cheney and Seyfarth in Baboon Metaphysics: The Evolution of a Social Mind, 2008) and of social intelligence (Hauser in Proc. Natl. Acad. Sci. 89:12137–12139, 1992). We use interdependence theory and game theory to explore the phenomena of deception from the perspective of robotics, and to develop an algorithm which allows an artificially intelligent system to determine if deception is warranted in a social situation. Using techniques introduced in Wagner (Proceedings of the 4th International Conference on Human-Robot Interaction (HRI 2009), 2009), we present an algorithm that bases a robot's deceptive action selection on its model of the individual it's attempting to deceive. Simulation and robot experiments using these algorithms which investigate the nature of deception itself are discussed.
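The abstract gestures at two pieces: a test for when deception is warranted, and action selection driven by a model of the mark. Here is a loose sketch of what that could look like in the hide-and-seek setting; the payoff values, the dependence/conflict test, and the naive seeker model are assumptions inspired by the interdependence-theory framing, not the authors' published algorithm.

    from itertools import product

    SPOTS = ["left", "center", "right"]

    def payoffs(hide, guess):
        """Illustrative outcome matrix: (hider_payoff, seeker_payoff)."""
        return (0.0, 1.0) if hide == guess else (1.0, 0.0)

    def deception_warranted():
        """Sketch of an interdependence-style test: deceive only when there is
        dependence (the mark's action changes the deceiver's payoff) and
        conflict (the two agents prefer different joint outcomes)."""
        dependence = any(
            len({payoffs(h, g)[0] for g in SPOTS}) > 1 for h in SPOTS
        )
        best_for_hider = max(product(SPOTS, SPOTS), key=lambda hg: payoffs(*hg)[0])
        best_for_seeker = max(product(SPOTS, SPOTS), key=lambda hg: payoffs(*hg)[1])
        conflict = best_for_hider != best_for_seeker
        return dependence and conflict

    def choose_signal(hide_spot, mark_model):
        """Pick the false signal that maximizes the hider's payoff, given the
        model's prediction of how the mark will respond to each signal."""
        candidates = [s for s in SPOTS if s != hide_spot]
        return max(candidates, key=lambda s: payoffs(hide_spot, mark_model(s))[0])

    if __name__ == "__main__":
        naive_mark = lambda signal: signal  # assumption: the seeker trusts the trail
        if deception_warranted():
            print("deception warranted; lay false trail toward:",
                  choose_signal("left", naive_mark))

The key design point, as described in the abstract, is that the deceptive action is chosen against a model of the individual being deceived: a different mark model (say, a suspicious seeker who discounts obvious trails) would lead the same selection rule to a different signal.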