Wednesday, September 29, 2010

CIA charged with use of 'illegal, inaccurate code to target kill drones'

A story embarrassing to the CIA appeared in The Register on September 24th, titled CIA used 'illegal, inaccurate code to target kill drones': 'They want to kill people with software that doesn't work'.
The CIA is implicated in a court case in which it's claimed it used an illegal, inaccurate software "hack" to direct secret assassination drones in central Asia.

The target of the court action is Netezza, the data warehousing firm that IBM bid $1.7bn for on Monday. The case raises serious questions about the conduct of Netezza executives, and the conduct of CIA's clandestine war against senior jihadis in Afghanistan and Pakistan.

The dispute surrounds a location analysis software package - "Geospatial" - developed by a small company called Intelligent Integration Systems (IISi), which like Netezza is based in Massachusetts. IISi alleges that Netezza misled the CIA by saying that it could deliver the software on its new hardware, to a tight deadline.

When the software firm then refused to rush the job, it's claimed, Netezza illegally and hastily reverse-engineered IISi's code to deliver a version that produced locations inaccurate by up to 13 metres. Despite knowing about the miscalculations, the CIA accepted the software, court submissions indicate.

Tuesday, September 28, 2010

Dogs 1 Robots 0

The Dogs of War Get Their Due in New Jersey

In Iraq and Afghanistan, the nature of war has changed, forcing the Pentagon to retool for unconventional foes. Amid the push for robotic IED detectors and aerial drones, however, is renewed investment in another, less techie counterinsurgency tool: war dogs. While they’ve served in every modern conflict, no other war has so closely matched their particular skills—which helps explain why their ranks have more than doubled since 2001, from 1,300 to about 2,800 dogs, mostly German Shepherds. “The capability they bring”—to track snipers, smell explosives, and sense danger—“cannot be replicated by man or machine,” said Gen. David Petraeus in February 2008, according to an Air Force publication. He went on to urge investment in the animals, noting that “their yield outperforms any asset we have in our inventory.”

That, coupled with the fact that most dogs serve multiple tours and dozens have died in the current conflicts, compelled the U.S. War Dogs Association, a New Jersey–based nonprofit, to lobby for an official medal for canine service. Last month, the Pentagon demurred, saying medals are only for people. So the association designed its own two-inch-wide medal for deserving dogs nationwide. It’s shipped medals to about 30 dogs, including hounds at Fort Lewis in Washington state and Maryland’s Fort Meade.

Bored Predator Drone

Bored Predator Drone Pumps A Few Rounds Into Mountain Goat

From: the Onion

Monday, September 27, 2010

Was the Stuxnet Virus produced by the US, Israel, or another wealthy nation?

Many of you may have noticed stories this week about the Stuxnet virus, which propose that a virus of this sophistication could have only been created by a large directed effort, probably that of a wealthy nation. The apparent target of the virus is vulnerabilities in Iran's IT industry. Stuxnet specifically targets software developed by Siemens AG. It is presumed that China, Russia, Israel, Britain, Germany and the United States are the countries most likely to have initiated this new venture in cyberwarfare.

Read the AP article titled, Computer Attacks Linked to Wealthy Group or Nation.

Sunday, September 26, 2010

More on Robot Deception

While I was in Berlin, Germany last week with Ron Arkin and Colin Allen, the IEET published the question, "Do we need a law making it illegal for computers and robots to deceive or be dishonest?" The question had been stimulated by recent articles about research performed by Ron and research engineer Alan Wagner. Ron was particularly pleased that this research had gotten people to ask questions such as this. Stimulating reflection on serious ethical concerns has always been one of his goals.

However, our conversations went in a somewhat different direction. Is the publicity creating the impression that the relatively low level mechanisms Arkin and Wagner introduced into their experiment are the equivalent of higher level cognitive ability? In other words, are we feeding a false impression that robots are much more sophisticated than they are, or are likely to be in the foreseeable future?

Ron pointed out that the actual research and the press release that accompanied it were responsible; but, as we all know, the press can distort scientific findings for its own purposes.

Here are some additional links for those interested in this subject.

Click here to link to the research paper titled, Acting Deceptively: Providing Robots with the Capacity for Deception
Hyperlink to the original Press Release.
Article at NewScientist titled, Deceptive robots hint at machine self-awareness.
Vote on the question at Polldaddy and view the results.

Roughly 60% favor outlawing or restricting deceptive robots, but only half of these thought such a law would be enforceable.

Society for Philosophy and Technology: Technology and Security


From May 26-29, 2011, the University of North Texas will host the 17th international conference of the Society for Philosophy and Technology:

The conference theme is "Technology and Security," but papers reflecting on any aspect of technology are welcomed. We also welcome interdisciplinary submissions from those studying technology in fields other than philosophy. See the call for papers here: Abstracts can be submitted to: Please note the abstract submission deadline is November 1, 2010.

The keynote speaker is P.W. Singer, Senior Fellow and Director of the 21st Century Defense Initiative at the Brookings Institution and author of Wired for War: The Robotics Revolution and 21st Century Combat.

ETHICOMP 2011: The Social Impact of Social Computing

Sheffield Hallam University, Sheffield, UK

Wednesday 14 September to Friday 16 September 2011

Call for Papers to the 12th ETHICOMP conference:“The social impact of social computing”.

The overall theme of ETHICOMP 2011 is the huge range of impacts on us all of advances in social computing. Under this theme, papers with a social/ethical perspective in the following areas are particularly welcomed.

Online communities - Blogs, wikis, social networks, collaborative bookmarking, social tagging, podcasts, tweeting, augmented reality
Business and public sector - Recommendation, forecasting, reputation, feedback, decision analysis, e-government, e-commerce
Interactive entertainment - Edutainment, training, gaming, storytelling
Web technology
Database technology
Multimedia technology
Wireless technology
Agent technology
Software engineering
Social psychology
Communication and human-computer interaction theories
Social network analysis
Organisation theory
Computing theory
Ethical theory
Information and computer ethics
Papers covering one or several of these perspectives are called for from business, government, computer science, information systems, law, media, anthropology, psychology, sociology and philosophy. Interdisciplinary papers and those from new researchers and practitioners are encouraged. A paper might take a conceptual, applied, practical or historical focus. Case studies and reports on lessons learned in practice are welcomed.

The full announcement is available here.

Saturday, September 25, 2010

Call to Establish an Arms Control Regime for Robots

A strong call to limit armed tele-operated and autonomous systems came out of the workshop in Berlin this past week. What follows is an excerpt from the full statement.
We believe:
 That the long-term risks posed by the proliferation and further development of these weapon systems outweigh whatever short-term benefits they may appear to have.
 That it is unacceptable for machines to control, determine, or decide upon the application of force or violence in conflict or war.* In all cases where such a decision must be made, at least one human being must be held personally responsible and legally accountable for the decision and its foreseeable consequences.
 That the currently accelerating pace and tempo of warfare is further escalated by these systems and undermines the capacity of human beings to make responsible decisions during military operations.
 That the asymmetry of forces that these systems make possible encourages states, and non-state actors, to pursue forms of warfare that reduce the security of citizens of possessing states.
 That the fact that a vehicle is uninhabited does not confer a right to violate the sovereignty of states.

There is, therefore, an urgent need to bring into existence an arms control regime to regulate the development, acquisition, deployment, and use of armed tele-operated and autonomous robotic weapons.

The full statement can be read here.

You Can Teach a Quadrotor Drone New Tricks

Friday, September 17, 2010

Robot Arms Control Workshop in Berlin, Germany

Many of the people familiar to readers of this blog will be coming together for a three day workshop (Sept 20th-22nd) in Berlin, Germany to discuss various calls for international arms treaties directed at regulating the roboticization of warfare. The workshop has been organized by the International Committee for Robot Arms Control (ICRAC). Among the workshop participants will be: Jürgen Altmann, Ron Arkin, Peter Asaro, Dennis Gormley, Joanne Mariner, Eugene Miasnikov, Götz Neuneck, Elizabeth Quintana, Wolfgang Richter, Lambèr Royakkers, Niklas Schörnig, Noel Sharkey, Rob Sparrow, Mark Steinbeck, Detlev Wolter, Uta Zapf, Colin Allen, and myself.

The Guardian published an article yesterday (Sept 16th) discussing the conference and the need for robot arms control. Read the full article titled, Robot warfare: campaigners call for tighter controls of deadly drones: Conferences will raise concerns over unpiloted aircraft and ground machines that choose their own targets.

Cyborgs on Mars

In the September issue of Endeavour, senior curator at the Smithsonian National Air and Space Museum Roger Launius takes a look at the historical debate surrounding human colonization of the solar system and how human biology will have to adapt to such extreme space environments. . .

If humans are to colonize other planets, Launius said it could well require the "next state of human evolution" to create a separate human presence where families will live and die on that planet. In other words, it wouldn't really be Homo sapiens sapiens that would be living in the colonies, it could be cyborgs—a living organism with a mixture of organic and electromechanical parts—or in simpler terms, part human, part machine. . .

The possibility of using cyborgs for space travel has been the subject of research for at least half a century. An influential article published in 1960 by Manfred Clynes and Nathan Kline titled “Cyborgs and Space” changed the debate. According to them, there was a better alternative to recreating the Earth’s environment in space, the predominant thinking during that time. The two scientists compared that approach to “a fish taking a small quantity of water along with him to live on land.” They felt that humans should be willing to partially adapt to the environment to which they would be traveling.

“Altering man’s bodily functions to meet the requirements of extraterrestrial environments would be more logical than providing an earthly environment for him in space,” Clynes and Kline wrote. . .

Grant Gillett, a professor of medical ethics at the Otago Bioethics Center of the University of Otago Medical School in New Zealand said addressing the ethical issue is really about justifying the need for such an approach, the need for altering humans so significantly that they end up not entirely human in the end.

“(Whether we) should do it largely depends on if it's important enough for humanity in general,” Gillett said. “To some extent, that's the justification.”

Read the full article titled, Cyborgs Needed for Escape from Earth in Astrobiology Magazine from which these excerpts were extracted.

The Future of Context-Aware Computing

Justin Rattner, Intel VP and Chief Technology Officer, described the future of context-aware computing (devices that anticipate needs and desires and try to fulfill them) during a keynote at the Intel Developer Forum.
Rattner devoted most of his keynote to explaining and demonstrating how Intel is working to make context-aware computing a mainstream reality. The demonstrations included:
• Tim Jarrell, the Vice President and Publisher of Fodor's Travel, arrived onstage to demonstrate a new Fodor's app (created in collaboration with Intel) that can recommend restaurants based on what the user likes and eats, and the user's location in the city. When used in "Wander" mode, the app helps orient the user by providing information about surrounding landmarks. (A very similar technology, named Augmented Reality, was demonstrated at the Intel Labs "Zero Day" IDF event on Sunday.) The app is not available yet, but Fodor's is continuing development on it.
• Intel Research Scientist Lama Nachman demonstrated the use of "shimmer sensors," wearable sensors that measure stride time and swing time and showed charts that measured Rattner's movements onstage during his speech (he had been wearing them on his ankles). This technology was intended to help measure the gait of elderly people who had difficulty walking.
• A remote control that "enhances the smart TV experience" by recognizing who's holding a remote control and adjusting the viewing experience accordingly.
• A sensing system, roughly the size of a large cell phone, that could animate avatars to let you know a person's current activity. One example showed how, when someone sitting and drinking coffee received a phone call and left the coffee shop, the device would animate a troll-like creature first sitting and then walking while talking on a cell phone.

Read the full PCMAG article titled, Rattner Describes the Future of Context-Aware Computing.

Brain Controlled Wheelchair

Researchers at the Federal Institute of Technology in Lausanne have developed a wheelchair that can be controlled by patients with their thoughts. The technology combines an electroencephalograph (EEG) with software that interprets the intent of the patient.
EEG has limited accuracy and can only detect a few different commands. Maintaining these mental exercises when trying to maneuver a wheelchair around a cluttered environment can also be very tiring, says José del Millán, director of noninvasive brain-machine interfaces at the Federal Institute of Technology, who led the project. "People cannot sustain that level of mental control for long periods of time," he says. The concentration required also creates noisier signals that can be more difficult for a computer to interpret.

Shared control addresses this problem because patients don't need to continuously instruct the wheelchair to move forward; they need to think the command only once, and the software takes care of the rest. "The wheelchair can take on the low-level details, so it's more natural," says Millán.

The wheelchair is equipped with two webcams to help it detect obstacles and avoid them. If drivers want to approach an object rather than navigate around it, they can give an override command. The chair will then stop just short of the object.
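The shared-control idea described above can be sketched in a few lines of code: the patient issues a high-level command once, and low-level software keeps executing it while steering around obstacles, with an override mode for deliberately approaching an object. This is purely an illustrative sketch; the function name, sensor format, and thresholds are my own assumptions, not details of the actual Lausanne system.

```python
def shared_control_step(command, obstacle_distances, override=False,
                        stop_margin=0.5):
    """Return a (speed, steering) pair for one control cycle.

    command: the latched high-level intent, e.g. "forward" or "stop",
        issued once by the patient rather than continuously.
    obstacle_distances: sector -> metres, e.g.
        {"left": 2.0, "center": 0.4, "right": 3.0}, as might come
        from the wheelchair's webcams.
    override: if True, approach the object ahead instead of avoiding
        it, stopping just short of it.
    """
    if command == "stop":
        return 0.0, 0.0

    center = obstacle_distances.get("center", float("inf"))

    if override:
        # Approach mode: drive toward the object, stop at the margin.
        if center <= stop_margin:
            return 0.0, 0.0
        return 0.5, 0.0

    # Avoidance mode: when something is close ahead, slow down and
    # steer toward the more open side; otherwise continue straight.
    if center <= stop_margin:
        left = obstacle_distances.get("left", float("inf"))
        right = obstacle_distances.get("right", float("inf"))
        return 0.3, (-1.0 if left > right else 1.0)
    return 0.5, 0.0
```

The point of the sketch is that the patient's noisy EEG signal only has to select `command` once; the obstacle handling in each cycle is entirely the software's job.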

Read the full article from Technology Review titled, Wheelchair Makes the Most of Brain Control: Artificial intelligence improves a wheelchair system that could give paralyzed people greater mobility.

Sunday, September 12, 2010

Ryan Calo Interviewed by Robots Podcast

Ryan Calo, a senior research fellow at Stanford Law School who also founded the Stanford Robots and Law Blog, was interviewed by the Robots Podcast. The full interview can be played here.

Survey on Attitudes Regarding Unmanned Systems

Gerhard Dabringer conducted a Survey on Unmanned Systems at AUVSI in Denver in August. He has made his findings available in a Summary Report available here. Among his findings are:

1. The use of Robotic Combat Systems (RCS) is generally approved of, though there is a strong tendency towards the "man in the loop" approach, especially when systems are weaponized.

2. There is a strong need for a broad discussion of ethical aspects as well as legal aspects of RCS.

3. Policy makers need to make sure that the existing discussions are being noticed.

4. RCS are recognized as a new ethical dimension in warfare and a majority sees the need for new international legislation.

5. Autonomous use of weapons by the RCS is generally not approved of.

Deceptive Robots

Gizmag describes research by Ronald Arkin and Alan Wagner in which robots are taught to deceive.
What it all boiled down to was a series of 20 hide-and-seek experiments. The autonomous hiding/deceiving robot could randomly choose one of three hiding spots, and would have no choice but to knock over one of three paths of colored markers to get there. The seeking robot could then, presumably, find the hiding robot by identifying which path of markers was knocked down. Sounds easy, except that sneaky, conniving hiding robot would turn around after knocking down one path of markers, and go hide in one of the other spots.

In 75 percent of the trials, the hiding robot succeeded in evading the seeking robot. In the other 25 percent, it wasn’t able to knock down the right markers necessary to produce its desired deception. The full results of the Georgia Tech experiment were recently published in the International Journal of Social Robotics.

The full research article is titled, Acting Deceptively: Providing Robots with the Capacity for Deception.
Abstract: Deception is utilized by a variety of intelligent systems ranging from insects to human beings. It has been argued that the use of deception is an indicator of theory of mind (Cheney and Seyfarth in Baboon Metaphysics: The Evolution of a Social Mind, 2008) and of social intelligence (Hauser in Proc. Natl. Acad. Sci. 89:12137–12139, 1992). We use interdependence theory and game theory to explore the phenomena of deception from the perspective of robotics, and to develop an algorithm which allows an artificially intelligent system to determine if deception is warranted in a social situation. Using techniques introduced in Wagner (Proceedings of the 4th International Conference on Human-Robot Interaction (HRI 2009), 2009), we present an algorithm that bases a robot's deceptive action selection on its model of the individual it's attempting to deceive. Simulation and robot experiments using these algorithms which investigate the nature of deception itself are discussed.
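The logic of the hide-and-seek experiment is simple enough to sketch in code. Below is an idealized toy version of the set-up described in the Gizmag excerpt, not the authors' actual implementation: the hider must knock over one of three marker paths to reach a hiding spot, and deceiving means knocking down the path to one spot and then hiding in another, while the seeker trusts the marker evidence.

```python
import random

def hider_choose(deceive=True, rng=random):
    """Pick which marker path to knock down and where to hide."""
    spots = [0, 1, 2]
    knocked = rng.choice(spots)  # the path the hider disturbs
    if deceive:
        # Hide somewhere other than the spot the markers point to.
        hide = rng.choice([s for s in spots if s != knocked])
    else:
        hide = knocked
    return knocked, hide

def seeker_guess(knocked):
    # The seeker trusts the evidence: search the spot whose
    # marker path was knocked down.
    return knocked

def run_trials(n=1000, deceive=True, rng=random):
    """Fraction of trials in which the hider evades the seeker."""
    evasions = 0
    for _ in range(n):
        knocked, hide = hider_choose(deceive, rng)
        if seeker_guess(knocked) != hide:
            evasions += 1
    return evasions / n
```

In this idealized version a deceiving hider evades every time, and an honest one never does; the 75 percent success rate in the real experiment reflects the physical failures to knock down the intended markers, which this sketch leaves out.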

Thursday, September 2, 2010

Newer videos of ECCEROBOT

Is a Robot Crime Wave on the Near Horizon?

The Coming Robot Crime Wave is an article by Noel Sharkey, Marc Goodman, and Nick Ross, which outlines a number of ways in which present and future robotic systems will be adapted to perpetrate a wide variety of illegal activities. One example they discuss is Narco submarines.
Major criminal organizations such as drug cartels don’t need to rely on cheap home engineering. Discoveries of submarines designed to carry tons of narcotics have been occurring since 1988. With 10 tons of cocaine netting $200 million, $2 million for a submarine would repay the robot’s cost many times over in one voyage. The drug cartels clearly have the money to adapt their technology to keep ahead of enforcement agencies.

Once the exclusive and secretive preserve of the military, this technology is becoming commonplace in civilian applications, with marine robots a prime example. So far, they’ve been used to locate the Titanic, investigate ice caps, build deep sea oil rigs, repair undersea cables, and mitigate environmental catastrophes such as the recent Deepwater Horizon explosion in the Gulf of Mexico.

In 2010, US officials secured the first convictions for remote-controlled drug smuggling when they imprisoned three men for building and selling drug subs ( b8Qawc). At the Tampa hearing, attorney Joseph K. Ruddy reported that these remote-controlled submarines were up to 40 feet long and could carry 1,800 kilograms of cocaine 1,000 miles without refueling. The effectiveness of these submarines in avoiding detection is clear, given that none have ever been seized. We only hear about the criminals’ failures, so there could be none, dozens, or hundreds of these machines in use.

The latest autonomous and semiautonomous submarine capabilities pose a greater concern. They can act on their own when required, employ programmed avoidance routines to thwart authorities, be fitted with sensors to send signals to the operator when the payload is delivered or the craft attacked, and carry self-destruct features to destroy incriminating evidence.

TILT 2011: Technologies on the stand: legal and ethical questions in neuroscience and robotics.

The Tilburg Institute for Law, Technology, and Society (TILT) is proud to announce the upcoming TILTing Perspectives 2011 conference entitled

"Technologies on the stand: legal and ethical questions in neuroscience and robotics."

The conference will be held at Tilburg University (the Netherlands) on 11 and 12 April 2011. It will focus on the legal and ethical questions raised by the application of neuroscience and robotics in various contexts. The conference will have two independent, but related tracks:

1. Law and neuroscience
The first track will focus on the legal and ethical issues surrounding recent developments in neuroscience and the legal application of neurotechnologies. Discussion topics will include, but are not limited to:
- the possible use of neurotechnologies in a legal context and the implications thereof,
- the role of neuroscience in determining legal capacities and in detecting deception,
- the legal and ethical issues surrounding the medical application of neurotechnologies, and
- the legal and ethical implications of using neurotechnologies for enhancement purposes.

2. Law, ethics and robotics
The second track will focus on the legal and ethical implications of the application of robotics in social environments (e.g., the home, hospitals and other health care institutes, in traffic, but also in war). Discussion topics will include, but are not limited to:
- the legal and ethical questions raised by the proliferation of robotics for the home environment,
- the legal and ethical questions raised by the deployment of robotics in war,
- liability and the legal status of robots, and
- autonomous action, agency and the ethical implications thereof.

The conference aims at bringing together national and international experts from the fields of (1) law and neuroscience and (2) law, ethics and robotics, and to facilitate discussion between lawyers, legal scholars, psychologists, social scientists, philosophers, neuroscientists and policy makers.

Our confirmed keynote speakers are:
- Stephen Morse (University of Pennsylvania)
- Paul Wolpe (Emory University)
- Wendell Wallach (Yale University)
- Noel Sharkey (University of Sheffield)

If you would like to present a paper at this conference, please send in an abstract (of max. 350 words) using the abstract submission system on our website:

Abstract submission is open from 1 September until 15 October. You may submit an abstract on the topics suggested above, or on a related topic that falls within the conference theme.

Full papers will be published in the conference proceedings. The winning paper in the Best Paper Contest will be published in a special edition of the international, peer reviewed journal Law, Innovation and Technology (Hart Publishers).

Important dates for submission:
- Deadline for submission of abstract: 15 October 2010
- Notification of acceptance and invitation to write a full paper: 1 November 2010
- Deadline for submission of full papers: 15 December 2010
- Reviewers' feedback and comments: 31 January 2011
- Deadline for submission of revised papers: 15 March 2011
- Conference dates: 11 and 12 April

For more information, please visit our website: