Wednesday, March 05, 2008

Signs of Judgment Day, Part 1

From Gizmodo:

UK robotics professor Noel Sharkey is raising a fuss over the US Defense Department's intention to put $4 billion into "unmanned systems" in the next year or two. One fear is that spillover from all that R&D will give terrorists new ways to build effective GPS-guided suicide bombers for $500 or less.
"How long is it going to be before the terrorists get in on the act? With the current prices of robot construction falling dramatically and the availability of ready-made components for the amateur market, it wouldn't require a lot of skill to make autonomous robot weapons."
But Sharkey has other, more philosophical concerns, ones that echo Isaac Asimov's own worries of more than half a century ago.

Says the New Scientist:

Sharkey is most concerned about the prospect of having robots decide for themselves when to "pull the trigger." Currently, a human is always involved in decisions of this nature. But the Pentagon is nearly two years into a research program aimed at having robots identify potential threats without human help.
But Ronald Arkin of Georgia Tech, the Siskel to Sharkey's Ebert, says that because a robot has no emotional baggage, it could be a much more "ethical" killer:
Arkin suggests trying to design ethical control systems that make military robots respect the Geneva Convention and other rules of engagement on the battlefield... "With a robot I can be sure that a robot will never harbour the intention to hurt a non-combatant," he says. "Ultimately they will be able to perform better than humans."
Hello!

Terminator? Battlestar Galactica? ED-209?

