Navy report warns of robot uprising, suggests a strong moral compass

J. Flatley | February 18, 2009, 5:55 PM

You know, when armchair futurists (and jive talkin' bloggists) make note of some of the scary new tech making the rounds in defense circles these days, it's one thing, but when the Doomsday Scenarios come from official channels, that's when we start to get nervous. According to a report published by California State Polytechnic University (with data made available by the U.S. Navy's Office of Naval Research), the sheer scope of the military's various AI projects is so vast that it is impossible for anyone to fully understand exactly what's going on. "With hundreds of programmers working on millions of lines of code for a single war robot," says Dr. Patrick Lin, the chief compiler of the report, "no one has a clear understanding of what's going on, at a small scale, across the entire code base." And what we don't understand can eventually hunt us down and kill us. This isn't idle talk, either -- a software malfunction just last year caused U.S. Army robots to aim at friendly targets (fortunately, no shots were fired). The solution, Dr. Lin continues, is to teach robots "battlefield ethics... a warrior code." Of course, the government has had absolutely no problems with ethics over the years -- so programming its killer robots with some rudimentary values should prove relatively simple.

