Wednesday, December 16, 2009

Guilty Robots make it into the NYT Magazine's Year in Ideas

You know a subject has come of age when it is featured in the 'Year in Ideas' issue of the NYTimes Magazine, which comes out each December. Under the title Guilty Robots, Dara Kerr writes:

[I]magine robots that obey injunctions like Immanuel Kant’s categorical imperative — acting rationally and with a sense of moral duty. This July, the roboticist Ronald Arkin of Georgia Tech finished a three-year project with the U.S. Army designing prototype software for autonomous ethical robots. He maintains that in limited situations, like countersniper operations or storming buildings, the software will actually allow robots to outperform humans from an ethical perspective. “I believe these systems will have more information available to them than any human soldier could possibly process and manage at a given point in time and thus be able to make better informed decisions,” he says.
The software consists of what Arkin calls “ethical architecture,” which is based on international laws of war and rules of engagement. The robots' behavior is literally governed by these laws. For example, in one hypothetical situation, a robot aims at enemy soldiers, but then doesn't fire–because the soldiers are attending a funeral in a cemetery and fighting would violate international law.
But being an ethical robot involves more than just following rules. These machines will also have something akin to emotions–in particular, guilt. After considering several moral emotions like remorse, compassion and shame, Arkin decided to focus on modeling guilt because it can be used to condemn specific behavior and generate constructive change. While fighting, his robots assess battlefield damage and then use algorithms to calculate the appropriate level of guilt. If the damage includes noncombatant casualties or harm to civilian property, for instance, their guilt level increases. As the level grows, the robots may choose weapons with less risk of collateral damage or may refuse to fight altogether.

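To make the mechanism described above a bit more concrete, here is a minimal sketch in Python of how a guilt-driven "ethical governor" could be wired together. Everything in it, the class name, the protected-site list, the guilt weights, and the thresholds, is invented for illustration; it is not Arkin's actual software, only a toy version of the idea that rule checks gate engagement and accumulated guilt restricts or refuses the use of force.

from dataclasses import dataclass

PROTECTED_SITES = {"cemetery", "hospital", "school"}   # assumed examples

@dataclass
class EthicalGovernor:
    guilt: float = 0.0               # grows as battle-damage reports come in
    restrict_threshold: float = 0.5  # above this, only low-collateral weapons
    refuse_threshold: float = 1.0    # above this, refuse to engage at all

    def report_damage(self, noncombatant_casualties: int, civilian_property_hits: int) -> None:
        # Battle-damage assessment feeds back into the guilt level (weights invented).
        self.guilt += 0.3 * noncombatant_casualties + 0.1 * civilian_property_hits

    def may_engage(self, target_site: str) -> bool:
        # Rule check first (laws of war / rules of engagement), then guilt-based refusal.
        if target_site in PROTECTED_SITES:
            return False             # e.g. soldiers attending a funeral in a cemetery
        return self.guilt < self.refuse_threshold

    def select_weapon(self, weapons: list) -> dict:
        # Higher guilt shifts the choice toward lower collateral risk.
        if self.guilt >= self.restrict_threshold:
            return min(weapons, key=lambda w: w["collateral_risk"])
        return max(weapons, key=lambda w: w["effectiveness"])

# Toy usage: after a damage report, the governor refuses a protected site
# and, because guilt has passed the restriction threshold, picks the low-risk weapon.
gov = EthicalGovernor()
gov.report_damage(noncombatant_casualties=2, civilian_property_hits=1)
print(gov.may_engage("cemetery"))   # False: protected site
print(gov.select_weapon([
    {"name": "A", "effectiveness": 0.9, "collateral_risk": 0.8},
    {"name": "B", "effectiveness": 0.4, "collateral_risk": 0.1},
]))
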
1 comment:

Kimberly said...

What could be scarier than waking up in a menstruation hospital with a giant teddy bear robot nurse at your bedside? Perhaps a giant Hello Kitty robot nurse. But I digress.