Saturday, April 11, 2015

Who is the war criminal when a military robot starts killing civilians?

First came autonomous flying "drones", then driverless cars, and now it seems fully autonomous killer robots are being planned. If such machines are possible, and military planners seem to think they may be, just think of the problems.

Campaign to stop killer robots
In real life, Asimov's "three laws of robotics", which prevent robots from harming humans, do not exist. No such law could be made to work reliably. So if machines are built that are intended to kill people without any human control, they will inevitably kill "innocent civilians".

I know that war is messy and great evil happens in current wars with current weapons. But at least that evil is the direct result of a human action, and humans can be held to account - at least in principle.

That accountability is not possible with autonomous killing machines. Who would be guilty? The programming team? The software designers? The builders? The person who fitted the weapons or loaded the ammunition? The commander who ordered the deployment? The politician who authorised the purchase or the deployment in the field? Or maybe the voters who said nothing and allowed these weapons to be deployed in their name?

This is why such weapons must not be allowed.

The UN is discussing the Inhumane Weapons Convention in a session which opens on 13th April. The Campaign to Stop Killer Robots is lobbying to have autonomous killing machines pre-emptively added to the list.

This election campaign, when our politicians are at least pretending to listen to us, is a good time to challenge them on this issue.

Human Rights Watch has a detailed report on the dangers.

And the Guardian has a rather shorter article.

We must stop this!