Published by The Lawfare Institute
Announcing the inaugural paper of the Lawfare Research Paper Series
William C. Marra & Sonia K. McNeil, Understanding "The Loop": Autonomy, System Decision-Making, and the Next Generation of War Machines, Lawfare Research Paper Series 1-2012 (May 1, 2012), available at Lawfare and at SSRN.
Abstract: This paper responds to persistent confusion in public discussion about “autonomous” technologies. Unmanned aerial systems or “drones” are the focus of an especially intense debate that urgently requires greater clarity. The capabilities of these drones, and their relationship to their human operators, are too often described using imprecise language that unhelpfully complicates important questions about the technology. Should humans stay “in the loop” or “on the loop”? What is “the loop,” and what does it mean for humans to be “out” of it? Without a basic understanding of these technologies and a clear vocabulary with which to discuss them, lawyers and policymakers risk grounding foundational decisions about drones and other rapidly advancing, increasingly autonomous systems in inaccurate assumptions. Precision matters, for whether and to what degree technologies should be autonomous will be one of the most important public policy debates of the next generation. This paper offers a lexicon that lawyers and policymakers can use to analyze sophisticated systems and evaluate potential regulatory frameworks. Part I makes the case for clearer terminology and introduces the concepts of automation and autonomy. Part II first describes the common ground shared by automated and autonomous systems. In almost any machine system, the decision-making sequence is built on four basic stages: “observe,” “orient,” “decide,” and “act.” These four stages form the “loop” that is often invoked but rarely unpacked in debates about drones, especially the use of drones as a means of delivering kinetic force.
After explaining the “loop,” Part II explores distinctions between automation and autonomy in complex systems through the three key attributes of machine autonomy: the frequency of human operator interaction with the machine; the machine’s ability to function successfully despite environmental uncertainty; and the machine’s level of assertiveness, or freedom to select among different courses of action. This investigation suggests that autonomy is better conceptualized as a spectrum subject to calibration by humans than as a technological end state. Part III turns to drones and traces their past development, present state, and future evolution. It concludes that while today’s drones are merely automated, future versions may approach true autonomy. Part IV transitions from technology to policy by sketching an answer to the question of what, exactly, is so new about advanced drone technology if today’s systems are not markedly more autonomous than older models. Part V explains how this paper’s discussion of machine systems and autonomy can be used today by lawyers and policymakers to structure a regulatory regime for tomorrow’s advanced drones. Advanced drones represent only one type of complex system, however, and so the vocabulary developed here also has broader applications. Part VI concludes.