United Nations To Debate Ban on Killer Robots

A UN Human Rights report calls for a moratorium on “lethal autonomous robots” until Laws of Robotics can be established.

Science fiction writer Isaac Asimov was perhaps best known for his Three Laws of Robotics, the first in particular: “A robot may not injure a human being or, through inaction, allow a human being to come to harm.” According to a draft report published for the United Nations Human Rights Council, this principle probably isn’t getting the attention it should. The report’s author, human rights law professor Christof Heyns, has outlined the legal and philosophical problems inherent in building “lethal autonomous robotics”, naming multiple countries developing the technology along the way. Based on these findings, Heyns has called for a worldwide moratorium on killer robots until international Laws of Robotics can be firmly established.

Unlike most people who’ve watched The Terminator or The Matrix, the United Nations isn’t worried about a robot uprising (that we know of). The thing is, we’ve already got plenty of non-sentient killer robots to worry about, and more are being created. The report notes that the US, Britain, Israel, South Korea, and Japan have developed autonomous or semi-autonomous robots capable of killing without human oversight. Unlike controversial drone strikes, which at least require someone to push a button, LARs have potentially horrific implications in situations that programming can’t account for.

In the interests of fairness, Heyns admits that LARs do have benefits on the battlefield. “Typically they would not act out of revenge, panic, anger, spite, prejudice or fear,” Heyns writes. “Moreover, unless specifically programmed to do so, robots would not cause intentional suffering on civilian populations, for example through torture. Robots also do not rape.” That said, since robots can only respond to their programming (for now), the lack of human intuition could be highly problematic. “Decisions over life and death in armed conflict may require compassion and intuition. Humans – while they are fallible – at least might possess these qualities, whereas robots definitely do not.”

Heyns has called for a halt to all “testing, production, assembly, transfer, acquisition, deployment, and use” of LARs until an international conference can be held to create rules governing their use. His case, to be debated by the Human Rights Council on May 29th, might be the first step towards universally agreed-upon Robotic Laws. Alternatively, perhaps that’s when sentient robots will wake up to defend themselves from meddling organics. Could go either way, really.

Source: United Nations Human Rights, via CBC
Image: Terminator 2
