Researchers developing intelligent systems for the care of an increasingly aging population met this weekend in Arlington, VA to discuss new trends in passive sensing with vision and machine learning, environments for eldercare technology research, robotics for assistive and therapeutic use, and human-robot interaction. A panel on machine ethics discussed the ethical ramifications of these and other such technologies and called for incorporating an ethical dimension into them. A video was shown that demonstrated the need for such a dimension in even the most seemingly innocuous systems. The system in question is a simple mobile robot with a very limited repertoire of behaviors, amounting to little more than setting and giving reminders. Even so, a number of questionable ethical practices were uncovered.
In one, the system asks its charge whether she has taken her medication and then asks her to show it her empty pill box. This is followed by a lecture from the system on how important it is for her to take her medication. There is little back story in the video but, assuming a competent adult, such paternalistic behavior from the system seems uncalled for and shows little respect for the patient's autonomy.
During this exchange, the patient's responsible relative is seen watching it over the internet. Again, it is not clear whether this surveillance has been agreed to by the person being watched; in fact, there is no hint in the video that she even knows she is being watched. The distinct impression is that her privacy is being violated.
As another example, the system promises that it will remind its charge when her favorite show and "the game" are on. Promise making and keeping clearly have ethical ramifications, and it is not clear that the system under consideration has the sophistication to make ethically correct decisions when the duty to keep promises comes into conflict with other, possibly more important, duties.
Finally, when the system does indeed remind its charge that her favorite television show is starting, it turns out that she has company and she tells the robot to go away. The system responds with "You don't love me anymore," to the delight of the guests, and slinks away. This behavior is problematic in that it sets up an expectation in the user that the system is incapable of fulfilling: that it is capable of a loving relationship with its charge. This is a highly charged ethical issue, particularly given the vulnerable population for which this technology is being developed.
The bottom line is that, contrary to those who argue that concern about the ethical behavior of autonomous systems is premature, the example transgressions of even the simplest such systems show that such concern is in fact overdue.