Killer robots are at the heart of popcorn fare like The Matrix and Terminator movies, but there's a serious debate underlying it all: do we want to trust fully autonomous machines with lethal weapons? Some would argue that it's simply too risky, and the United Nations has accordingly held its first meetings to discuss a potential ban on the concept before it ever gets off the ground. Critics (including the UN's acting European head, Michael Moeller) argue that deadly robots may not consistently obey humanitarian laws, particularly in tricky situations; they may do things that are logically sound but morally flawed. There are also worries about accountability, since it may be difficult to hold armies and police forces directly responsible for deaths at their robots' hands.
It's not a one-sided argument, of course. While few would demand no-questions-asked approval of warrior 'bots, there is a concern that we're letting sci-fi go to our heads. The notion of a fully independent, death-dealing automaton isn't necessarily realistic for arms makers, so there may not be much pressure to update the UN's rulebook any time soon. The meetings could be premature, then, but it's arguably better that humanity establish its position before it's easy to put robots on the battlefield.
[Image credit: Campaign to Stop Killer Robots]