New Laws of Robotics proposed for US kill-bots

Droid-on-droid mayhem OK'd; machines to ask before snuffing humans

A new set of laws has been proposed to govern operations by killer robots. The ideas were floated by John S Canning, an engineer at the Naval Surface Warfare Center, Dahlgren Division – an American weapons-research and test establishment. Mr Canning's “Concept of Operations for Armed Autonomous Systems” presentation can be downloaded here (pdf).

Many Reg readers will be familiar with the old-school Asimov Laws of Robotics, but these are clearly unsuitable for war robots – too restrictive. However, the new Canning Laws are certainly not a carte blanche for homicidal droids to obliterate fleshies without limit; au contraire.

Canning proposes that robot warriors should be allowed to mix it up among themselves freely, autonomously deciding to blast enemy weapon systems. Many enemy “systems” would, of course, be themselves robots, so it's clear that machine-on-machine violence isn't a problem. The difficulty comes when the automatic battlers need to target humans. In such cases Mr Canning says that permission from a human operator should be sought.

“Let machines target other machines,” he writes, “and let men target men.”
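Canning's rule of engagement boils down to a single branch, which can be sketched as a toy decision function (purely illustrative – the names and structure here are ours, not anything in the paper):

```python
# Toy sketch of the proposed rule: machines may engage machines
# autonomously, but any engagement of a human target must first be
# approved by a human operator. All names are illustrative.

def may_engage(target_is_human: bool, operator_approves) -> bool:
    """Return True if the robot may open fire on the target."""
    if not target_is_human:
        return True  # machine-on-machine: fully autonomous
    return operator_approves()  # human target: ask a human first

# A drone sights an enemy radar (machine) and a rifleman (human):
print(may_engage(False, lambda: False))  # radar: engaged -> True
print(may_engage(True, lambda: False))   # rifleman, no sign-off -> False
```

The interesting failure modes, as the article goes on to note, are all hidden inside that innocent-looking `operator_approves()` call.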

The concept document makes the point that various kinds of automated death-tech have been allowed to destroy machinery or even people for years. It cites anti-shipping missiles which are sometimes sent off over the horizon and told to look around for a target. Other examples include automatic air-defence systems such as Phalanx or Aegis, which blast anything that comes at them too fast, or the “Captor” seabed system, which torpedoes passing submarines but leaves surface ships alone.

It isn't made clear how the ask-permission-to-kill-meatsacks rule could be applied in these cases. Doppler radar will have trouble distinguishing between attacking manned jets and incoming missiles, for instance. And even if the two could be swiftly and reliably told apart, adding a human reaction and decision period to an air-defence scenario may not be a survivable thing to do.

Mr Canning also says that the emphasis should be on destroying enemy weaponry rather than people.

“We can equip our machines with non-lethal technologies for the purpose of convincing the enemy to abandon their weapons prior to our machines destroying the weapons, and lethal weapons to kill their weapons,” he suggests.

This raises the prospect of American robot enforcers packing the crowd-cookers, strobe pacifier cannons or Star Trek puke blasters already reported by El Reg, and also some conventional exploding stuff. Once enemy troops had been partially grilled, rendered epileptic or incapacitated by vomit beams, presumably fleeing as a result, the droid assailants could blow up their abandoned tanks, artillery, ships or whatnot.

Of course, this might not work so well with personal enemy weaponry such as the ubiquitous AK47 or RPG. Interestingly, though, Mr Canning quotes US Air Force Major R Craig Burton of the Judge Advocate General's Legal Center:

“If people or property isn't a military objective, we don't target it. It might be destroyed as collateral damage, but we don't target it. Thus in many situations, we could target the individual holding the gun and/or the gun and legally there's no difference.”

Which seems to suggest that a robot could decide, under Mr Canning's rules, to target a weapon system such as an AK47 for destruction on its own initiative, requiring no permission from a human. If the person holding it was thereby killed, that would be collateral damage and the killer droid would be in the clear. Effectively the robot is allowed to disarm enemies by prying their guns from their cold dead hands.
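The Burton loophole amounts to classifying the *weapon* rather than the person as the target, which shifts the decision back into the autonomous branch. A toy extension of the earlier sketch (again purely illustrative, not anything from the source paper):

```python
# Toy sketch of the collateral-damage loophole: a weapon is a legitimate
# autonomous target even when a human is holding it; only a person *as
# the target* needs operator permission. Names are illustrative.

def may_engage_autonomously(target: str, held_by_human: bool) -> bool:
    """True if the robot may fire without asking a human operator."""
    # held_by_human is deliberately ignored: the holder's death would
    # count as collateral damage under the quoted legal reasoning.
    return target in {"machine", "weapon"}

print(may_engage_autonomously("weapon", held_by_human=True))   # True
print(may_engage_autonomously("person", held_by_human=False))  # False
```

Note that the `held_by_human` flag changes nothing – which is precisely the cold-dead-hands point being made above.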

El Reg's advice? Do what the droids say. They are our friends. ®
