New Laws of Robotics proposed for US kill-bots

Droid-on-droid mayhem OK'd; machines to ask before snuffing humans

A new set of laws has been proposed to govern operations by killer robots. The ideas were floated by John S Canning, an engineer at the Naval Surface Warfare Center, Dahlgren Division – an American weapons-research and test establishment. Mr Canning's “Concept of Operations for Armed Autonomous Systems” presentation can be downloaded here (pdf).

Many Reg readers will be familiar with the old-school Asimov Laws of Robotics, but these are clearly unsuitable for war robots – too restrictive. However, the new Canning Laws are certainly not a carte blanche for homicidal droids to obliterate fleshies without limit; au contraire.

Canning proposes that robot warriors should be allowed to mix it up among themselves freely, autonomously deciding to blast enemy weapon systems. Many enemy “systems” would, of course, be themselves robots, so it's clear that machine-on-machine violence isn't a problem. The difficulty comes when the automatic battlers need to target humans. In such cases Mr Canning says that permission from a human operator should be sought.

“Let machines target other machines,” he writes, “and let men target men.”

The concept document makes the point that various kinds of automated death-tech have been allowed to destroy machinery or even people for years. Canning cites anti-ship missiles which are sometimes sent off over the horizon and told to look around for a target. Other examples include automatic air-defence systems such as Phalanx or Aegis, which blast anything that comes at them too fast, or the “Captor” seabed system, which torpedoes passing submarines but leaves surface ships alone.

It isn't made clear how the ask-permission-to-kill-meatsacks rule could really be applied in these cases. Doppler radar is going to have trouble distinguishing between attacking manned jets and incoming missiles, for instance. Even if the two could be swiftly and reliably differentiated, adding a human reaction and decision period in an air-defence scenario may not be a survivable thing to do.

Mr Canning also says that the emphasis should be on destroying enemy weaponry rather than people.

“We can equip our machines with non-lethal technologies for the purpose of convincing the enemy to abandon their weapons prior to our machines destroying the weapons, and lethal weapons to kill their weapons,” he suggests.

This raises the prospect of American robot enforcers packing the crowd-cookers, strobe pacifier cannons or Star Trek puke blasters already reported by El Reg, and also some conventional exploding stuff. Once enemy troops had been partially grilled, rendered epileptic or incapacitated by vomit beams, presumably fleeing as a result, the droid assailants could blow up their abandoned tanks, artillery, ships or whatnot.

Of course, this might not work so well with personal enemy weaponry such as the ubiquitous AK47 or RPG. Interestingly, though, Mr Canning quotes air force major R Craig Burton of the Judge Advocate General's Legal Center:

“If people or property isn't a military objective, we don't target it. It might be destroyed as collateral damage, but we don't target it. Thus in many situations, we could target the individual holding the gun and/or the gun and legally there's no difference.”

Which seems to suggest that a robot could decide, under Mr Canning's rules, to target a weapon system such as an AK47 for destruction on its own initiative, requiring no permission from a human. If the person holding it was thereby killed, that would be collateral damage and the killer droid would be in the clear. Effectively the robot is allowed to disarm enemies by prying their guns from their cold dead hands.

El Reg's advice? Do what the droids say. They are our friends. ®
