Monday, November 23, 2009

Interview of Veruggio and Operto on Roboethics


Gianmarco Veruggio and Fiorella Operto of the Scuola di Robotica (Genova) were interviewed by Gerhard Dabringer. The full interview is available here.

GIANMARCO VERUGGIO
Roboethics is not the “Ethics of Robots”, nor any “ethical chip” in the hardware, nor any “ethical behavior” in the software; it is the human ethics of the robots’ designers, manufacturers, and users. In my definition, “Roboethics is an applied ethics whose objective is to develop scientific, cultural, and technical tools that can be shared by different social groups and beliefs. These tools aim to promote and encourage the development of Robotics for the advancement of human society and individuals, and to help prevent its misuse against humankind.”
Actually, in the context of the so-called Robotics ELS studies (Ethical, Legal, and Societal issues of Robotics), there are already two schools. One, let us call it “Robot-Ethics”, studies technical security and safety procedures to be implemented in robots, to make them as safe as possible for humans and the planet. Roboethics, on the other hand, which is my position, is concerned with the global ethical studies in Robotics, and it is a human ethics.

FIORELLA OPERTO
Roboethics is an applied ethics that refers to studies and works done in the field of Science & Ethics (Science Studies, S&TS, Science Technology and Public Policy, Professional Applied Ethics), and its main premises are derived from these studies. In fact, Roboethics was not born without parents: it derives its principles from the global guidelines of universally adopted applied ethics. This is the reason a relatively substantial part is devoted to this matter before discussing Roboethics’ sensitive areas specifically.
Many of the issues of Roboethics are already covered by applied ethics such as Computer Ethics or Bioethics. For instance, many problems arising in Roboethics have already been matters of investigation in Computer Ethics and Bioethics: dependability; technological addiction; the digital divide; the preservation of human identity and integrity; the application of precautionary principles; economic and social discrimination; the autonomy and accountability of artificial systems; responsibility for (possibly unintended) warfare applications; and the nature and impact of human-machine cognitive and affective bonds on individuals and society.
