The 19 December 2008 issue of Science magazine has a commentary piece by Noel Sharkey on "The Ethical Frontiers of Robotics", in which Sharkey emphasizes the technical difficulties and the psychological unknowns of battlefield robots and of child- and elder-care robots.
The full article can be read at the Science website (requires subscription).
The Ethical Frontiers of Robotics
Noel Sharkey
Robots have been used in laboratories and factories for many years, but their uses are changing fast. Since the turn of the century, sales of professional and personal service robots have risen sharply and are estimated to total ~5.5 million in 2008. This number, which far outstrips the 1 million operational industrial robots on the planet, is estimated to reach 11.5 million by 2011 (1). Service robots are good at dull, dangerous, and dirty work, such as cleaning sewers or windows and performing domestic duties in the home. They harvest fruit, pump gasoline, assist doctors and surgeons, dispose of bombs, and even entertain us. Yet the use of service robots poses unanticipated risks and ethical problems. Two main areas of potential ethical risk are considered here: the care of children and the elderly, and the development of autonomous robot weapons by the military.
The widespread availability of service robots has resulted from several developments that allowed robots to become mobile, interactive machines. Artificial intelligence has not met its early promise of truly intelligent machines, but researchers in the emerging field of human-robot interaction have implemented artificial intelligence techniques for the expression of emotion, language interaction, speech perception, and face recognition (2, 3).
Sophisticated control algorithms have been developed (4) and have been combined with advances in sensor technology, nanotechnology, materials science, mechanical engineering, and high-speed miniaturized computing. With the price of robot manufacture falling (robots were 80% cheaper in 2006 than in 1990), service robots are set to enter our lives in unprecedented numbers.
In the area of personal-care robots, Japanese and South Korean companies have developed child-minding robots that offer video games, verbal quiz games, speech recognition, face recognition, and conversation. Mobility and semiautonomous function are ideal for visual and auditory monitoring; radio-frequency identification (RFID) tags provide alerts when children move out of range. The robots can be controlled by mobile phone or from a window on a PC that allows input from camera "eyes" and remote talking from caregivers.
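To make the out-of-range alert mechanism concrete, here is a minimal Python sketch of that kind of monitoring loop. It is purely illustrative: every name in it (record_read, check_tags, MAX_SILENCE_S) is hypothetical, and no vendor's actual API is being described.

import time

# Assumed threshold: seconds a tag may go unread before an alert fires.
MAX_SILENCE_S = 30

# Maps a child's RFID tag ID to the time the tag was last detected.
last_seen = {"tag-child-1": time.time()}

def record_read(tag_id):
    """Called whenever the robot's RFID reader detects a tag."""
    last_seen[tag_id] = time.time()

def check_tags(notify_caregiver):
    """Alert the caregiver about any tag not seen within the threshold."""
    now = time.time()
    for tag_id, seen in last_seen.items():
        if now - seen > MAX_SILENCE_S:
            notify_caregiver(f"{tag_id} out of range for {now - seen:.0f}s")

# Poll every few seconds; print stands in for the phone/PC alert channel.
while True:
    check_tags(print)
    time.sleep(5)

The point of the sketch is how little judgment is involved: the robot can guarantee proximity monitoring and nothing more, which is exactly the gap between supervision and care discussed below.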
Research on child-minding robots in the United States (5) using the Sony QRIO and large-scale testing by NEC in Japan with their PaPeRo have demonstrated close bonding and attachment by children, who, in most cases, prefer a robot to a teddy bear. Short-term exposure can provide an enjoyable and entertaining experience that creates interest and curiosity. In the same way, television and computer games may be used by parents as an entertainment or distraction for short periods. They do not provide care, and the children still need human attention. However, because of the physical safety that robot minders provide, children could be left without human contact for many hours a day, or perhaps for several days, and the possible psychological impact of the varying degrees of social isolation on development is unknown.
What would happen if a parent were to leave a child in the safe hands of a future robot caregiver almost exclusively? The truth is that we do not know what the effects of the long-term exposure of infants would be. We cannot conduct controlled experiments on children to find out the consequences of long-term bonding with a robot, but we can get some indication from early psychological work on maternal deprivation and attachment. Studies of early development in monkeys have shown that severe social dysfunction occurs in infant animals allowed to develop attachments only to inanimate surrogates (6).
Despite these potential problems, no international or national legislation or policy guidelines exist except in terms of negligence, which has not yet been tested in court for robot surrogates and may be difficult to prove in the home (relative to cases of physical abuse). There is no guidance from any international nanny code of ethics, nor even from the U.N. Convention on the Rights of the Child (7) except by inference. There is a vital need for public discussion to decide the limits of robot use before the industry and busy parents make the decision themselves.
At the other end of the age spectrum, the growth in many countries of the elderly population relative to the number of available younger caregivers has spurred the development of sophisticated elder-care robots. Examples include the Secom "My Spoon" automatic feeding robot, the Sanyo electric bathtub robot that automatically washes and rinses, and the Mitsubishi Wakamaru robot for monitoring, delivering messages, and reminding about medicine. These robots can help the elderly to maintain independence in their own homes (8), but their presence could lead to the risk of leaving the elderly in the exclusive care of machines. The elderly need the human contact that is often provided only by caregivers and people performing day-to-day tasks for them (9).
Robot companions such as Paro the seal are marketed as pets because they are soft and cuddly and are designed to imitate some of the features of pets, such as purring when touched; in effect, they exploit human zoomorphism. They are being touted as a solution to the contact problem, but they are still toys that do not alleviate elder isolation, even if they may relieve some of the guilt felt by relatives, or by society in general, about that problem. The success of these robots may stem from people being systematically deluded about the real nature of their relation to the devices (10, 11).
A different set of ethical issues is raised by the use of robots in military applications. Coalition military forces in Iraq and Afghanistan have deployed more than 5000 mobile robots. Most are used for surveillance or bomb disposal, but some, like the TALON SWORDS and MAARS, are heavily armed for use in combat, although there have been no reports of lethal use yet. Semiautonomous unmanned combat air vehicles, such as the MQ-1 Predator and MQ-9 Reaper, carry Hellfire missiles and bombs and have been involved in many strikes against insurgent targets, resulting in the deaths of many innocents, including children.
Currently, all these weapons have a human in the loop to decide when to apply lethal force. However, there are plans to create robots that can autonomously locate targets and destroy them without human intervention (12), a high-priority agenda item for all the U.S. armed services (13, 14). Ground-based unmanned autonomous vehicles such as DARPA's Unmanned Ground Combat Vehicle (the PerceptOR Integration System) are already being created (15), and unmanned aerial vehicles (UAVs) are advancing in parallel: the military contractor BAE Systems has "completed a flying trial which, for the first time, demonstrated the coordinated control of multiple UAVs autonomously completing a series of tasks" (16). These developments fit with a major goal of the Future Combat Systems project, whose costs are estimated to exceed $230 billion: to use robots as force multipliers, so that one soldier can be a nexus for initiating large-scale ground (17) and aerial robot attacks (13). Robot autonomy is required because one soldier cannot control several robots, as the sketch below illustrates.
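A rough "fan-out" rule of thumb from human-robot interaction research makes the arithmetic plain: one operator can supervise about (neglect time / interaction time) + 1 robots before falling behind. The Python sketch below is not from any military system; the numbers are invented purely for illustration.

def fan_out(neglect_time_s, interaction_time_s):
    """Approximate number of robots one operator can supervise:
    (time a robot can safely be ignored / time each intervention takes) + 1."""
    return neglect_time_s / interaction_time_s + 1

# Invented numbers. A closely teleoperated robot that can be ignored for
# only 5 s and needs 60 s of operator time per intervention ties one
# soldier to roughly one robot:
print(fan_out(5, 60))    # ~1.1
# An autonomous robot neglectable for 10 min with 30-s check-ins lets
# one soldier supervise roughly 21 robots:
print(fan_out(600, 30))  # 21.0

On this logic, force multiplication is achieved precisely by pushing decisions, eventually including targeting decisions, away from the human.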
The ethical problems arise because no computational system can discriminate between combatants and innocents in a close-contact encounter. Computer programs require a clear definition of a noncombatant, but none is available. The 1949 Geneva Conventions suggest common sense, while the 1977 Protocol I update defines a civilian as someone who is not a combatant (18). Even with a definition, sensing systems are inadequate for the discrimination challenge, particularly in urban insurgency warfare. These complexities are difficult to resolve even for experienced troops in the field. No computational inference systems yet exist that could deal with the huge number of circumstances where lethal force is inappropriate. These systems should not be confused with smart bombs or submunitions, which require accurate human targeting.
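The definitional gap can be stated in code. Any autonomous targeting pipeline must eventually call a predicate like the hypothetical is_combatant below, and the only treaty-level definition available is circular; the stub is illustrative only, not anyone's actual system.

def is_combatant(person) -> bool:
    """Treaty law provides no decision procedure, so there is nothing
    to implement here."""
    raise NotImplementedError("no computable definition of a combatant")

def is_noncombatant(person) -> bool:
    # Protocol I (1977), Article 50, defines a civilian only negatively,
    # as someone who is not a combatant; in code the circularity is plain.
    return not is_combatant(person)

def authorize_lethal_force(target) -> bool:
    """An autonomous weapon's decision loop must bottom out in a check
    like this one; with the predicate above undefined, it can never
    legitimately return True."""
    return not is_noncombatant(target)

# Calling authorize_lethal_force(anything) raises NotImplementedError,
# which is exactly the point.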
Robots for care and for war represent just two of many ethically problematic areas that will soon arise from the rapid increase and spreading diversity of robotics applications. Scientists and engineers working in robotics must be mindful of the potential dangers of their work, and public and international discussion is vital in order to set policy guidelines for ethical and safe application before the guidelines set themselves.
References and Notes
1. IFR Statistical Department, World Robotics Report 2008 (www.worldrobotics.org).
2. C. Breazeal, Robot. Auton. Syst. 42, 167 (2003).
3. T. Fong, I. Nourbakhsh, K. Dautenhahn, Robot. Auton. Syst. 42, 143 (2003).
4. R. A. Brooks, IEEE J. Robot. Autom. 2, 14 (1986).
5. F. Tanaka, A. Cicourel, J. R. Movellan, Proc. Natl. Acad. Sci. U.S.A. 104, 17954 (2007).
6. D. Blum, Love at Goon Park: Harry Harlow and the Science of Affection (Wiley, Chichester, UK, 2003).
7. Convention on the Rights of the Child, adopted and opened for signature, ratification, and accession by U.N. General Assembly Resolution 44/25, 20 November 1989.
8. J. Forlizzi, C. DiSalvo, F. Gemperle, Hum. Comput. Interact. 19, 25 (2004).
9. R. Sparrow, L. Sparrow, Minds Mach. 16, 141 (2006).
10. R. Sparrow, Ethics Inf. Technol. 4, 305 (2002).
11. N. E. Sharkey, A. J. C. Sharkey, Artif. Intell. Rev. 25, 9 (2007).
12. N. E. Sharkey, IEEE Intell. Syst. 23, 14 (July-August 2008).
13. U.S. Department of Defense, Unmanned Systems Roadmap 2007-2032 (10 December 2007).
14. National Research Council, Committee on Autonomous Vehicles in Support of Naval Operations, Autonomous Vehicles in Support of Naval Operations (National Academies Press, Washington, DC, 2005).
15. Fox News, "Pentagon's 'Crusher' Robot Vehicle Nearly Ready to Go," 27 February 2008 (www.foxnews.com/story/0,2933,332755,00.html).
16. United Press International, "BAE Systems Tech Boosts Robot UAV's IQ," Industry Briefing, 26 February 2008 (http://bae-systems-news.newslib.com/story/3951-3226462).
17. U.S. Department of Defense, OUSD (AT&L) Defense Systems/Land Warfare and Munitions 3090, Joint Robotics Program Master Plan FY2005 (2005).
18. Protocol I Additional to the Geneva Conventions, 1977 (Article 50).
Supported by a fellowship from the Engineering and Physical Sciences Research Council, UK.
10.1126/science.1164582
3 comments:
Thanks for posting this, a very insightful account by Noel Sharkey. I'd like to read more about the Korean and Israeli border patrol bots.
Here's a link to a 2007 story in The Register (UK) about the robotic guns on the Israeli border, and it also contains a link to an article about the Korean robots.
For those who haven't seen the movie Doomsday: the scene showing unmanned robotic guns atop a great wall "protecting" society from presumed plague-infected zombies is, as far as I can gather, not far from the reality of the border guns in Israel.