
Killer robots: Not Terminators, just land mines with microchips

Published 29 May 2013 · Sam Roggeveen

I've just recorded an interview with a reporter from ABC Radio National's estimable PM program on the subject of 'killer robots'. The UN Human Rights Council has released what looks to be an interesting draft paper on the subject, and it's due to be debated in Geneva tonight.

The key distinction the report makes between 'drones' (which already roam the battlefield) and 'robots' (which are still in the future) is human agency. Armed drones have a human decision-maker 'in the loop', meaning the final decision to use lethal force is made not by a machine but by a person, and probably several people. In the case of lethal autonomous robots, on the other hand:

...targeting decisions could be taken by the robots themselves. In addition to being physically removed from the kinetic action, humans would also become more detached from decisions to kill – and their execution.

Note 'more detached', not 'unattached'.

Here's the key point I tried to make in the interview: what's often missed in media stories about killer robots (inevitably accompanied by photos of Terminators) is that even if they do enter the battlefield, humans still ultimately make the decision to use lethal force. After all, it is people who will program these robots and deploy them.

The difference is that, whereas in the case of drones, the human operator is removed from the act of using force by distance, in the case of robots, the human is removed by distance and time. And when you think about it that way, you realise killer robots are not that different to land mines or booby traps.

The problem, as the UN report points out, is that computer programming can never match the human capacity to make quick judgments about moral principles such as proportionality (use only as much force as necessary) and discrimination (direct force only at aggressors, not bystanders). Then again, such programming can also help us to bypass our passions, which lead us to do all sorts of hideous things in war.

Photo courtesy of Northrop Grumman.


