
Tuesday, May 21, 2013

‘Campaign to Stop Killer Robots’ calling for ban on ‘fully autonomous weapons’

from rawstory.com: The idea of autonomous killer robots may seem like the stuff of science fiction, but human rights groups are already preparing for what appears to be the future of weaponry. The Campaign to Stop Killer Robots, a coalition of international groups, is preparing for a global summit in Geneva, Switzerland, on Wednesday, May 29 that will review a U.N. report on these types of weapons that was released earlier this week. The Campaign hopes to convince nations to sign on to an international ban on autonomous weapons.

Raw Story spoke with Mary Wareham of the Arms Division of Human Rights Watch, who is the coordinator of the Campaign to Stop Killer Robots. The Campaign also includes representatives of the Association for Aid and Relief Japan, the Nobel Women’s Initiative, the International Committee for Robot Arms Control and others. Wareham began by explaining that there is a difference between these autonomous weapons and armed drones.

“We’re calling the weapons we’re talking about ‘fully autonomous,’” she said. “The U.N. report calls them ‘lethal autonomous robotics.’

“Fully autonomous weapons have complete autonomy in terms of who they target and how they engage force,” she said. “And by autonomy, I mean no human operation, intervention or involvement. With armed drones, there is still what they call ‘the man in the loop.’ Unlike autonomous weapons, drones are still controlled by a human.”

She said that while some people might find the campaign’s focus far-fetched or outlandish, this is the direction in which weaponry is moving: toward greater and greater autonomy.


“Our objective is to see what comes next,” she said. She cautioned against believing that autonomous weapons would be humanoid, cyborg “Terminators” like in the film series.

“We’re not talking about Terminators, here,” she said, “or cyborgs, or whatever you want to call sophisticated killer robots that people think of from science fiction. We’re saying that autonomous weapons will come in all different shapes and sizes. And even the most rudimentary device may be lethal.”

Human Rights Watch issued a 50-page report in November entitled “Losing Humanity: The Case Against Killer Robots,” in which the organization discussed issues of international humanitarian law, of accountability and other threats to civilians posed by autonomous weapons.

It also listed current weapons systems in use in the world that are what HRW sees as precursors to autonomous weapons systems, including armed drones, but also much larger aircraft like the X-47B, a pilotless stealth fighter that launched from an aircraft carrier this week. Other weapons include stationary weapons systems like the Samsung SGR-1 robot, which currently uses infrared vision to patrol the border between North and South Korea, directing machine-gun fire at warm, moving targets.

Currently, she said, the Korean device has to “signal back to base” and receive human permission to fire on targets. She does not expect that step to remain in place forever.

“There’s debate as to when fully autonomous weapons will show up,” Wareham said. “Some people say within 20 or 30 years, which we put in the report. Others say sooner.”

Also in November, the Pentagon issued a directive regarding autonomous weapons pledging that the U.S. military will keep humans in the loop with regard to decisions of targeting and the use of force. The upside of that, said Wareham, is that the U.S. at least has a policy; no other country is even considering one, even though all of the “usual suspects” (China, Russia and the U.S.) are exploring the technology.

The downside of the U.S. policy is that the directive has to be renewed and is only in effect for the next 10 years at most.

When asked how current U.S. use of armed drones augurs for the future of autonomous weapons, Wareham said, “At Human Rights Watch, we’ve been working on armed drones for several years, since they first started to deploy them. Our principal concern is with the use of armed drones, the rules regarding that, the compliance with international humanitarian law, the lack of transparency, the targeted killing policy, the role of the CIA and the Department of Defense, but all of those things relate to how drones are used” and will also have an impact on remote weapons when humans are removed from the process.

“We’ve got multiple concerns about drones,” she said, “but we’re also keeping an eye on the future and where this technology is heading.”

“We have to start asking the questions now. We have to start discussing it,” she said, “and start putting some rules and regulations in place around how we handle autonomy in warfare.”

In Geneva next Wednesday, the delegation from Human Rights Watch intends to ask for an international treaty on autonomous weapons, “but those discussions haven’t even started yet,” she said. “We’re at the very beginning here.”

According to the U.N. report, “lethal autonomous robotics raise far-reaching concerns about the protection of life during war and peace. This includes the question of the extent to which they can be programmed to comply with the requirements of international humanitarian law and the standards protecting life under international human rights law. Beyond this, their deployment may be unacceptable because no adequate system of legal accountability can be devised, and because robots should not have the power of life and death over human beings.”
