Killer robots. Video-game warfare. Unlawful weapons. Terminators. Drone-attack commentary has become synonymous with reports of civilian carnage, claims of international-law violations, and worries about whether high-tech robotic wars have become too easy and fun to be effectively prevented. But the debate over drones is misleading the public about the nature of the weaponry and the law. It is also distracting attention from bigger, more important issues: whether truly autonomous weapons should be permitted in combat, how to track the human cost of different weapons platforms and promote humanitarian standards in war, and whether targeted killings -- by drones or SEAL teams -- are a lawful means of combating global terrorism. Based on our analysis of recent op-eds, we unpack four sets of misconceptions below and offer some sensible ways for the anti-drone lobby to reframe the debate.
No. 1: Drones Are "Killer Robots." This is actually two assumptions; neither is precisely wrong, but both are misleading. First, drones themselves are not necessarily "killers": They are used for many nonlethal purposes as well. Drones (unmanned aerial vehicles) can carry payloads ranging from cameras and sensors to weapons and have been deployed for nonlethal purposes such as intelligence gathering and surveillance since the 1950s. Yet the nonlethal applications of drones are often lost in a discussion that treats the technology per se as deadly; 90 percent of the op-eds we analyzed focus solely on drones as killing machines.
Of course, it's true that drones can be used to kill. Some drones over Libya are now armed, and armed drones have been launching strikes in Afghanistan, Pakistan, and Yemen for years. Second, even weaponized drones are not "killer robots," despite the frequent references in the op-eds we studied to "robotic weapons" or "robotic warfare." Their flight and surveillance systems can extract information from the environment and use it to move safely and purposively, but the weapons themselves are controlled by a human operator and are not autonomous. With a human in the loop navigating the aircraft and controlling the weapon, the "killer" aspect of these drones may be remote-controlled, but it is not robotic.
This distinction is easily lost on a concerned public, but it matters. Indeed, the debate over "killer robot drones" that aren't actually autonomous is diverting public attention from a more groundbreaking development in military technology: preparations to delegate targeting decisions to truly autonomous weapons platforms, many of which are not drones at all. As Brookings Institution scholar Peter W. Singer has argued, a shift toward fully autonomous weapons systems would represent a sea change in the very nature of war. Groups like the International Committee for Robot Arms Control have called for a multilateral discussion to stem, or at least regulate, these developments. Those worried about drones might usefully refocus their attention on the debate over whether to keep humans in the loop for unmanned aerial vehicles and other weapons platforms globally. The big issue here is not drones per se. It is the extent to which life-and-death targeting decisions should ever be outsourced to machines.