Do Killer Robots Violate Human Rights?
Patrick Lin
2015-04-21

As bizarre as it sounds, the United Nations just held an arms-control conference to figure out if killer robots might violate the laws of war.

Ten years ago, very few experts were worried about military robots. The technology was just emerging onto the battlefield. Now, several credible groups are waging war against killer robots, officially known as lethal autonomous weapons systems.

The UN returned to the subject last week in a five-day meeting of experts for the Convention on Certain Conventional Weapons. I was invited by the convention’s chairperson, German Ambassador Michael Biontino, to speak about the problems that lethal autonomous weapons systems may create for human rights. This essay is adapted from my testimony and offers a glimpse of how this important debate is moving along. (These are my opinions alone and don't necessarily reflect the positions of UNIDIR or other organizations.)

The specific issue I was asked to address is whether killer robots, in making kill-decisions without human intervention, violate either a right to life or the "laws of humanity," as protected by the Martens Clause that has been in effect since the 1899 Hague Convention. (The Martens Clause requires nations to consider warfare through the lens of the “public conscience.”)

These concerns are of a different kind than technology-based objections to killer robots. For instance, critics point out that artificial intelligence still can’t reliably distinguish between a lawful target (such as an enemy combatant with a gun) and an unlawful one (such as a civilian with an ice-cream cone), as the laws of war demand. Technology limitations like this one may be solvable over time. But if lethal autonomous weapons are truly an assault on human rights, that’s a philosophical challenge that can’t be solved with better science and engineering alone. So it’s worth focusing on human rights as among the most persistent problems for killer robots, and I’ll keep that separate from technical issues so as not to confuse an already complex debate.

Read the rest here.