Stepping back from the brink: Why multilateral regulation of autonomy in weapons systems is difficult, yet imperative and feasible

Published online by Cambridge University Press: 03 February 2021

Abstract

This article explains why regulating autonomy in weapons systems, entailing the codification of a legally binding obligation to retain meaningful human control over the use of force, is such a challenging task within the framework of the United Nations Convention on Certain Conventional Weapons. It is difficult because it requires new diplomatic language, and because the military value of weapon autonomy is hard to forego in the current arms control winter. The article argues that regulation is nevertheless imperative, because the strategic as well as ethical risks outweigh the military benefits of unshackled weapon autonomy. To this end, it offers some thoughts on how the implementation of regulation can be expedited.

Type
Artificial intelligence, autonomous weapon systems and their governance
Copyright
Copyright © The Author(s), 2021. Published by Cambridge University Press on behalf of the ICRC.

References

1 United Nations Office at Geneva, “The Convention on Certain Conventional Weapons”, available at: https://tinyurl.com/y4orq8q5 (all internet references were accessed in December 2020).

2 UN, Meeting of the High Contracting Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects: Revised Draft Final Report, UN Doc. CCW/MSP/2019/CRP.2/Rev.1, Geneva, 15 November 2019 (CCW Meeting Final Report), p. 5, available at: https://tinyurl.com/y3gjy7mk.

3 Ibid., p. 10.

4 Russian Federation, Potential Opportunities and Limitations of Military Uses of Lethal Autonomous Weapons Systems: Working Paper Submitted by the Russian Federation, UN Doc. CCW/GGE.1/2019/WP.1, 15 March 2019, p. 5, available at: https://tinyurl.com/yx9op3n4.

5 German Federal Foreign Office, “Foreign Minister Maas on Agreement of Guiding Principles relating to the Use of Fully Autonomous Weapons Systems”, press release, 15 November 2019, available at: www.auswaertiges-amt.de/en/newsroom/news/maas-autonomous-weapons-systems/2277194.

6 KRC, “Alarm Bells Ring on Killer Robots”, 15 November 2019, available at: www.stopkillerrobots.org/2019/11/alarmbells/; Richard Moyes, “Critical Commentary on the ‘Guiding Principles’”, Article 36, November 2019, available at: www.article36.org/wp-content/uploads/2019/11/Commentary-on-the-guiding-principles.pdf.

7 Future of Life Institute (FLI), “Autonomous Weapons: An Open Letter from AI and Robotics Researchers”, 28 July 2015, available at: https://futureoflife.org/open-letter-autonomous-weapons/; FLI, “An Open Letter to the United Nations Convention on Certain Conventional Weapons”, 21 August 2017, available at: https://futureoflife.org/autonomous-weapons-open-letter-2017/.

8 Mary Wareham, “As Killer Robots Loom, Demands Grow to Keep Humans in Control of Use of Force”, Human Rights Watch, 2020, available at: www.hrw.org/world-report/2020/country-chapters/killer-robots-loom-in-2020.

9 The need to arrive at a shared definition of LAWS remains a common notion among the CCW States Parties, and some still view it as a prerequisite for the talks to make any progress. As an example of this line of thought, see the chair's summary of the discussion at the 2019 GGE meeting: “Some delegations chose to address the issue of definitions, with several different views on the need for definitions – working or otherwise – to make further progress in the work of the Group.” UN, Report of the 2019 Session of the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems: Chair's Summary, UN Doc. CCW/GGE.1/2019/3/Add.1, 8 November 2019, p. 3, available at: https://tinyurl.com/y68rzkub.

10 Léonard van Rompaey, “Shifting from Autonomous Weapons to Military Networks”, Journal of International Humanitarian Legal Studies, Vol. 10, No. 1, 2019, pp. 112–119, available at: https://brill.com/view/journals/ihls/10/1/article-p111_111.xml.

11 Elvira Rosert and Frank Sauer, “How (Not) to Stop the Killer Robots: A Comparative Analysis of Humanitarian Disarmament Campaign Strategies”, Contemporary Security Policy, 30 May 2020, available at: https://tinyurl.com/y23o8lo6.

12 Jozef Goldblat, Arms Control: The New Guide to Negotiations and Agreements, SAGE Publications, London, 2002, Chap. 5.

13 Treaty on Conventional Armed Forces in Europe, 19 November 1990, available at: www.osce.org/library/14087.

14 Maya Brehm, Defending the Boundary: Constraints and Requirements on the Use of Autonomous Weapon Systems Under International Humanitarian and Human Rights Law, Geneva Academy Briefing No. 9, May 2017, pp. 15–16.

15 Richard Moyes, “Key Elements of Meaningful Human Control”, Article 36, April 2016, available at: www.article36.org/wp-content/uploads/2016/04/MHC-2016-FINAL.pdf. Article 36 is a member of the KRC.

16 ICRC, Autonomous Weapon Systems: Implications of Increasing Autonomy in the Critical Functions of Weapons, Geneva, 2016; US Department of Defense (DoD), Directive 3000.09, “Autonomy in Weapon Systems”, 2012 (amended 2017); Paul Scharre, Army of None: Autonomous Weapons and the Future of War, W. W. Norton, New York, 2018.

17 Vincent Boulanin and Maaike Verbruggen, Mapping the Development of Autonomy in Weapon Systems, Stockholm International Peace Research Institute (SIPRI), Stockholm, 2017, available at: www.sipri.org/sites/default/files/2017-11/siprireport_mapping_the_development_of_autonomy_in_weapon_systems_1117_0.pdf.

18 International Panel on the Regulation of Autonomous Weapons (iPRAW), Focus on Human Control, iPRAW Report No. 5, August 2019, available at: www.ipraw.org/wp-content/uploads/2019/08/2019-08-09_iPRAW_HumanControl.pdf.

19 ICRC, above note 16, p. 7.

20 For the implications of autonomy in earlier stages of the targeting cycle, which are not discussed further here, see Arthur H. Michel, “The Killer Algorithms Nobody's Talking About”, Foreign Policy, 20 January 2020, available at: https://foreignpolicy.com/2020/01/20/ai-autonomous-weapons-artificial-intelligence-the-killer-algorithms-nobodys-talking-about/.

21 Israel Aerospace Industries, “HARPY: Autonomous Weapon for All Weather”, available at: www.iai.co.il/p/harpy. A loitering munition is a weapons system that “loiters” in an area for a prolonged period of time, waiting for targets to appear.

22 Frank Sauer, “Stopping ‘Killer Robots’: Why Now Is the Time to Ban Autonomous Weapons Systems”, Arms Control Today, Vol. 46, No. 8, 2016, pp. 8–9.

23 To give but one example, the J3016 Levels of Automated Driving standard issued by the Society of Automotive Engineers (SAE) “defines six levels of driving automation” and considers level 5 to be “full vehicle autonomy”. SAE, “SAE Standards News: J3016 Automated-Driving Graphic Update”, 7 January 2019, available at: www.sae.org/news/2019/01/sae-updates-j3016-automated-driving-graphic.

24 I owe this point to an anonymous reviewer.

25 The general notion of an action–reaction dynamic created by increasing autonomy was first described by Jürgen Altmann: “Because of very fast action and reaction, autonomous weapon systems would create strong pressures for fast attack if both opponents have got them.” Jürgen Altmann, “Military Uses of Nanotechnology: Perspectives and Concerns”, Security Dialogue, Vol. 35, No. 1, 2004, p. 63.

26 Maya Brehm, “Targeting People”, Article 36, November 2019, available at: www.article36.org/wp-content/uploads/2019/11/targeting-people.pdf; Richard Moyes, “Autonomy in Weapons Systems: Mapping a Structure for Regulation Through Specific Policy Questions”, Article 36, November 2019, available at: www.article36.org/wp-content/uploads/2019/11/regulation-structure.pdf; Richard Moyes, “Target Profiles”, Article 36, August 2019, available at: https://t.co/HZ1pvMnIks?amp=1; iPRAW, above note 18; Vincent Boulanin, Neil Davison, Netta Goussac and Moa Peldán Carlsson, Limits on Autonomy in Weapon Systems: Identifying Practical Elements of Human Control, SIPRI and ICRC, June 2020, available at: www.sipri.org/sites/default/files/2020-06/2006_limits_of_autonomy.pdf.

27 iPRAW, above note 18, pp. 12–13.

28 Daniele Amoroso and Guglielmo Tamburrini, What Makes Human Control over Weapon Systems “Meaningful”?, International Committee for Robot Arms Control, August 2019, available at: www.icrac.net/wp-content/uploads/2019/08/Amoroso-Tamburrini_Human-Control_ICRAC-WP4.pdf.

29 I am thankful to an anonymous reviewer for this clarification.

30 E. Rosert and F. Sauer, above note 11.

31 Paul Scharre, Robotics on the Battlefield, Part II: The Coming Swarm, Center for a New American Security (CNAS), October 2014, available at: https://tinyurl.com/yy4gxs43; Maaike Verbruggen, The Question of Swarms Control: Challenges to Ensuring Human Control over Military Swarms, EU Non-Proliferation and Disarmament Consortium, Non-Proliferation and Disarmament Paper No. 65, December 2019.

32 Michael C. Horowitz, “When Speed Kills: Lethal Autonomous Weapon Systems, Deterrence and Stability”, Journal of Strategic Studies, Vol. 42, No. 6, 2019; Jürgen Altmann and Frank Sauer, “Autonomous Weapon Systems and Strategic Stability”, Survival, Vol. 59, No. 5, 2017.

33 Ronald C. Arkin, “Ethical Robots in Warfare”, IEEE Technology and Society Magazine, Vol. 28, No. 1, 2009; Ronald C. Arkin, “The Case for Ethical Autonomy in Unmanned Systems”, Journal of Military Ethics, Vol. 9, No. 4, 2010; Ronald C. Arkin, “Governing Lethal Behavior in Robots”, IEEE Technology and Society Magazine, Vol. 30, No. 4, 2011; United States, Implementing International Humanitarian Law in the Use of Autonomy in Weapon Systems: Working Paper Submitted by the United States of America, UN Doc. CCW/GGE.1/2019/WP.5, 28 March 2019, available at: https://tinyurl.com/y4xe7tmc.

34 Elsa B. Kania, “In Military-Civil Fusion, China Is Learning Lessons from the United States and Starting to Innovate”, The Strategy Bridge, 27 August 2019, available at: https://thestrategybridge.org/the-bridge/2019/8/27/in-military-civil-fusion-china-is-learning-lessons-from-the-united-states-and-starting-to-innovate; Elsa B. Kania, “AI Weapons” in China's Military Innovation, Brookings Institution, April 2020, available at: www.brookings.edu/wp-content/uploads/2020/04/FP_20200427_ai_weapons_kania_v2.pdf; Frank Sauer, “Military Applications of Artificial Intelligence: Nuclear Risk Redux”, in Vincent Boulanin (ed.), The Impact of Artificial Intelligence on Strategic Stability and Nuclear Risk, SIPRI, Stockholm, 2019.

35 Michael C. Horowitz, “Artificial Intelligence, International Competition, and the Balance of Power”, Texas National Security Review, Vol. 1, No. 3, 2018, available at: https://tnsr.org/2018/05/artificial-intelligence-international-competition-and-the-balance-of-power/; Zachary Davis, “Artificial Intelligence on the Battlefield: Implications for Deterrence and Surprise”, Prism, Vol. 8, No. 2, 2019, pp. 117–121.

36 United States, above note 33. I would like to thank an anonymous reviewer for highlighting this.

37 Shashank R. Reddy, India and the Challenge of Autonomous Weapons, Carnegie Endowment for International Peace, June 2016, p. 12, available at: https://carnegieendowment.org/files/CEIP_CP275_Reddy_final.pdf.

38 See J. Altmann, above note 25; Armin Krishnan, Killer Robots: Legality and Ethicality of Autonomous Weapons, Ashgate, Farnham, 2009, Chap. 6; Jean-Marc Rickli, Some Considerations of the Impact of LAWS on International Security: Strategic Stability, Non-State Actors and Future Prospects, presentation at CCW Meeting of Experts on LAWS, Geneva, 16 April 2015, available at: https://tinyurl.com/y4fjozpf; Paul Scharre, Autonomous Weapons and Operational Risk, CNAS Ethical Autonomy Project, Washington, DC, February 2016, available at: https://s3.amazonaws.com/files.cnas.org/documents/CNAS_Autonomous-weapons-operational-risk.pdf; Wendell Wallach, “Toward a Ban on Lethal Autonomous Weapons: Surmounting the Obstacles”, Communications of the ACM, Vol. 60, No. 5, 2017, p. 31; Irving Lachow, “The Upside and Downside of Swarming Drones”, Bulletin of the Atomic Scientists, Vol. 73, No. 2, 2017; J. Altmann and F. Sauer, above note 32; Paul Scharre, “Autonomous Weapons and Stability”, PhD thesis, King's College London, March 2020, available at: https://kclpure.kcl.ac.uk/portal/files/129451536/2020_Scharre_Paul_1575997_ethesis.pdf.

39 The following section draws on J. Altmann and F. Sauer, above note 32; F. Sauer, above note 34; Aaron Hansen and Frank Sauer, “Autonomie in Waffensystemen: Chancen und Risiken für die US-Sicherheitspolitik” (“Autonomy in Weapons Systems: Opportunities and Risks for US Security Policy”), Zeitschrift für Außen- und Sicherheitspolitik, Vol. 12, No. 2, 2019.

40 For the general argument, see Hedley Bull, The Anarchical Society: A Study of Order in World Politics, Macmillan, London, 1977. For the case of AI, see Elsa B. Kania and Andrew Imbrie, “Great Powers Must Talk to Each Other about AI”, Defense One, 28 January 2020, available at: www.defenseone.com/ideas/2020/01/great-powers-must-talk-each-other-about-ai/162686/?oref=d-river.

41 Frank Sauer and Niklas Schörnig, “Killer Drones: The Silver Bullet of Democratic Warfare?”, Security Dialogue, Vol. 43, No. 4, 2012; Matthew Fuhrmann and Michael C. Horowitz, “Droning On: Explaining the Proliferation of Unmanned Aerial Vehicles”, International Organization, Vol. 71, No. 2, 2017; Andrea Gilli and Mauro Gilli, “The Diffusion of Drone Warfare? Industrial, Organizational, and Infrastructural Constraints”, Security Studies, Vol. 25, No. 1, 2016.

42 Defense Science Board, The Role of Autonomy in DoD Systems, 2012, pp. 69–71.

43 New America, “World of Drones”, available at: www.newamerica.org/in-depth/world-of-drones/.

44 Sydney J. Freedberg Jr., “Robot Wars: Centaurs, Skynet, and Swarms”, Breaking Defense, 31 December 2015, available at: http://breakingdefense.com/2015/12/robot-wars-centaurs-skynet-swarms/; Thomas G. Mahnken, Technology and the American Way of War Since 1945, Columbia University Press, New York, 2008, p. 123.

45 Robert O. Work, “Robert Work Talks NATO's Technological Innovation and the DoD”, CNAS Brussels Sprouts Podcast, 11 January 2018, available at: www.cnas.org/publications/podcast/robert-work-talks-natos-technological-innovation-and-the-dod.

46 Defense Science Board, Summer Study on Autonomy, 2016, p. 45; Elsa B. Kania, Battlefield Singularity: Artificial Intelligence, Military Revolution, and China's Future Military Power, CNAS, Washington, DC, November 2017, available at: https://s3.amazonaws.com/files.cnas.org/documents/Battlefield-Singularity-November-2017.pdf?mtime=20171129235805; E. B. Kania, “In Military-Civil Fusion”, above note 34.

47 Kelley Sayler, A World of Proliferated Drones: A Technology Primer, CNAS, Washington, DC, 2015, p. 29.

48 Sebastien Roblin, “The U.S. Army Needs More Anti-Aircraft Weapons – and Fast”, War is Boring, 22 January 2018, available at: http://warisboring.com/the-u-s-army-needs-more-anti-aircraft-weapons-and-fast/.

49 David Barno and Nora Bensahel, “The Drone Beats of War: The U.S. Vulnerability to Targeted Killings”, War on the Rocks, 21 January 2020, available at: https://warontherocks.com/2020/01/the-drone-beats-of-war-the-u-s-vulnerability-to-targeted-killings/. A decapitation scenario is a scenario in which an attacker aims to destroy or destabilize an opponent's leadership and command and control structure in order to severely degrade or destroy its capacity for (nuclear) retaliation.

50 Sydney J. Freedberg Jr., “Drones Need Secure Datalinks to Survive vs. Iran, China”, Breaking Defense, 10 August 2012, available at: http://breakingdefense.com/2012/08/drones-need-secure-datalinks-to-survive-vs-iran-china/.

51 For a critical overview, see Gary Marcus, “Deep Learning: A Critical Appraisal”, New York University, 2 January 2018, available at: https://arxiv.org/ftp/arxiv/papers/1801/1801.00631.pdf.

52 Michał Klincewicz, “Autonomous Weapons Systems, the Frame Problem and Computer Security”, Journal of Military Ethics, Vol. 14, No. 2, 2015; Anh Nguyen, Jason Yosinski and Jeff Clune, “Deep Neural Networks Are Easily Fooled: High Confidence Predictions for Unrecognizable Images”, Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2015, pp. 427–436; Z. Davis, above note 35, pp. 121–122.

53 Ivan Evtimov et al., “Robust Physical-World Attacks on Deep Learning Models”, Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2017, available at: https://arxiv.org/pdf/1707.08945.pdf.

54 See the by now famous example of the turtle mistaken for a rifle, in Anish Athalye, Logan Engstrom, Andrew Ilyas and Kevin Kwok, “Synthesizing Robust Adversarial Examples”, Proceedings of the 35th International Conference on Machine Learning, Vol. 80, 2018, available at: https://arxiv.org/pdf/1707.07397.pdf.

55 Defense Science Board, above note 46, p. 28; Vadim Kozyulin, “International and Regional Threats Posed by the LAWS: Russian Perspective”, PIR Center for Policy Studies, April 2016, available at: https://tinyurl.com/y4qslefc; P. Scharre, Autonomous Weapons and Operational Risk, above note 38, p. 14.

56 Melissa Hellmann, “Special Sunglasses, License-Plate Dresses: How to Be Anonymous in the Age of Surveillance”, Seattle Times, 12 January 2020, available at: www.seattletimes.com/business/technology/special-sunglasses-license-plate-dresses-juggalo-face-paint-how-to-be-anonymous-in-the-age-of-surveillance/.

57 P. Scharre, above note 38, p. 21.

58 Charles Perrow, Normal Accidents: Living with High-Risk Technologies, Basic Books, New York, 1984.

59 John Borrie, Security, Unintentional Risk, and System Accidents, United Nations Institute for Disarmament Research (UNIDIR), Geneva, 15 April 2016, available at: https://tinyurl.com/yyaugayk; P. Scharre, Autonomous Weapons and Operational Risk, above note 38.

60 P. Scharre, Autonomous Weapons and Operational Risk, above note 38, p. 13.

61 UNIDIR, The Weaponization of Increasingly Autonomous Technologies in the Maritime Environment: Testing the Waters, UNIDIR Resources No. 4, Geneva, 2015, p. 8.

62 G. Marcus, above note 51, pp. 10–11.

63 See, for the example of the Patriot missile defence system, John K. Hawley, Patriot Wars: Automation and the Patriot Air and Missile Defense System, CNAS, Washington, DC, January 2017, available at: www.cnas.org/publications/reports/patriot-wars.

64 P. Scharre, Autonomous Weapons and Operational Risk, above note 38, p. 31; Noel Sharkey and Lucy Suchman, “Wishful Mnemonics and Autonomous Killing Machines”, Proceedings of the AISB, Vol. 136, 2013, pp. 16–17.

65 Defense Science Board, above note 42, p. 15.

66 André Haider and Maria Beatrice Catarrasi, Future Unmanned System Technologies: Legal and Ethical Implications of Increasing Automation, Joint Air Power Competence Centre, November 2016, p. 10, available at: www.japcc.org/wp-content/uploads/Future_Unmanned_System_Technologies_Web.pdf; ICRC, Views of the International Committee of the Red Cross (ICRC) on Autonomous Weapon System[s], Geneva, 11 April 2016, p. 3, available at: www.icrc.org/en/download/file/21606/ccw-autonomous-weapons-icrc-april-2016.pdf.

67 Gary Shorter and Rena S. Miller, High-Frequency Trading: Background, Concerns, and Regulatory Developments, Congressional Research Service, 19 June 2014, available at: https://fas.org/sgp/crs/misc/R43608.pdf.

68 P. Scharre, Autonomous Weapons and Operational Risk, above note 38, p. 53; J. Altmann and F. Sauer, above note 32, pp. 128–132.

69 “Video: Here's How the US Air Force Is Automating the Future Kill Chain”, Defense News, 2019, available at: www.defensenews.com/video/2019/11/16/heres-how-the-us-air-force-is-automating-the-future-kill-chain-dubai-airshow-2019/.

70 Yuna H. Wong et al., Deterrence in the Age of Thinking Machines, RAND Corporation, 2020, p. xi, available at: www.rand.org/pubs/research_reports/RR2797.html.

71 Bruce G. Blair, The Logic of Accidental Nuclear War, Brookings Institution, Washington, DC, 1993, p. 181; Richard Rhodes, Arsenals of Folly: The Making of the Nuclear Arms Race, Simon & Schuster, London, 2008, pp. 165–166; David E. Hoffman, The Dead Hand: The Untold Story of the Cold War Arms Race and Its Dangerous Legacy, Doubleday, New York, 2009, pp. 6–11, 94–95; Mark Gubrud, “Stopping Killer Robots”, Bulletin of the Atomic Scientists, Vol. 70, No. 1, 2014; Michael C. Horowitz, Paul Scharre and Alexander Velez-Green, A Stable Nuclear Future? The Impact of Autonomous Systems and Artificial Intelligence, working paper, December 2019, pp. 13–14, available at: https://arxiv.org/ftp/arxiv/papers/1912/1912.05291.pdf; Paul Scharre, “A Million Mistakes a Second”, Foreign Policy, 12 September 2018, available at: https://foreignpolicy.com/2018/09/12/a-million-mistakes-a-second-future-of-war/.

72 M. Horowitz, P. Scharre and A. Velez-Green, above note 71, p. 14; Philip Reiner and Alexa Wehsner, “The Real Value of Artificial Intelligence in Nuclear Command and Control”, War on the Rocks, 4 November 2019, available at: https://warontherocks.com/2019/11/the-real-value-of-artificial-intelligence-in-nuclear-command-and-control/. On the resulting cyber vulnerabilities, see James Johnson, “The AI–Cyber Nexus: Implications for Military Escalation, Deterrence and Strategic Stability”, Journal of Cyber Policy, Vol. 4, No. 3, 2019.

73 With the exception of Adam Lowther and Curtis McGiffin, “America Needs a ‘Dead Hand’”, War on the Rocks, 16 August 2019, available at: https://warontherocks.com/2019/08/america-needs-a-dead-hand/.

74 Edward Geist and Andrew J. Lohn, How Might Artificial Intelligence Affect the Risk of Nuclear War?, RAND Corporation, 2018, available at: www.rand.org/content/dam/rand/pubs/perspectives/PE200/PE296/RAND_PE296.pdf; Vincent Boulanin, Lora Saalman, Petr Topychkanov, Fei Su and Moa Peldán Carlsson, Artificial Intelligence, Strategic Stability and Nuclear Risk, SIPRI, Stockholm, June 2020, available at: www.sipri.org/sites/default/files/2020-06/artificial_intelligence_strategic_stability_and_nuclear_risk.pdf.

75 James M. Acton (ed.), Entanglement: Chinese and Russian Perspectives on Non-Nuclear Weapons and Nuclear Risks, Carnegie Endowment for International Peace, 2017, p. 1, available at: http://carnegieendowment.org/files/Entanglement_interior_FNL.pdf.

76 Sebastian Brixey-Williams, “Will the Atlantic Become Transparent?”, November 2016, available at: https://britishpugwash.org/wp-content/uploads/2016/11/Will-the-Atlantic-become-transparent-.pdf.

77 DoD, Nuclear Posture Review 2018, 2018, p. 21, available at: https://tinyurl.com/yc7lu944.

78 CCW Meeting Final Report, above note 2, p. 10.

79 Frank Sauer, Daniele Amoroso, Noel Sharkey, Lucy Suchman and Guglielmo Tamburrini, Autonomy in Weapon Systems: The Military Application of Artificial Intelligence as a Litmus Test for Germany's New Foreign and Security Policy, Heinrich Böll Foundation Publication Series on Democracy, Vol. 49, 2018, pp. 23–32, available at: www.boell.de/sites/default/files/boell_autonomy-in-weapon-systems_v04_kommentierbar_1.pdf.

80 The following section draws on Elvira Rosert and Frank Sauer, “Prohibiting Autonomous Weapons: Put Human Dignity First”, Global Policy, Vol. 10, No. 3, 2019.

81 Christof Heyns, Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, UN Doc. A/HRC/23/47, 2013, p. 17, available at: https://digitallibrary.un.org/record/755741/files/A_HRC_23_47-EN.pdf.

82 Peter Asaro, “On Banning Autonomous Weapon Systems: Human Rights, Automation, and the Dehumanization of Lethal Decision-Making”, International Review of the Red Cross, Vol. 94, No. 886, 2012; Robert Sparrow, “Robots and Respect: Assessing the Case Against Autonomous Weapon Systems”, Ethics & International Affairs, Vol. 30, No. 1, 2016.

83 KRC, Making the Case: The Dangers of Killer Robots and the Need for a Preemptive Ban, 2016, pp. 21–25.

84 ICRC, Ethics and Autonomous Weapon Systems: An Ethical Basis for Human Control?, Geneva, 3 April 2018, available at: www.icrc.org/en/download/file/69961/icrc_ethics_and_autonomous_weapon_systems_report_3_april_2018.pdf.

85 Amanda Sharkey, “Autonomous Weapons Systems, Killer Robots and Human Dignity”, Ethics and Information Technology, Vol. 21, No. 2, 2019.

86 Deane-Peter Baker, “The Awkwardness of the Dignity Objection to Autonomous Weapons”, The Strategy Bridge, 6 December 2018, available at: https://thestrategybridge.org/the-bridge/2018/12/6/the-awkwardness-of-the-dignity-objection-to-autonomous-weapons.

87 I would like to thank an anonymous reviewer for specifying these properties of civilian-ness and proportionality.

88 F. Sauer et al., above note 79, p. 33.

89 Heather M. Roff, “The Strategic Robot Problem: Lethal Autonomous Weapons in War”, Journal of Military Ethics, Vol. 13, No. 3, 2014, p. 219.

90 Christof Heyns, “Autonomous Weapons in Armed Conflict and the Right to a Dignified Life: An African Perspective”, South African Journal on Human Rights, Vol. 33, No. 1, 2017, pp. 62–63.

91 See text related to above note 27.

92 Peter Asaro, “Jus Nascendi, Robotic Weapons, and the Martens Clause”, in Ryan Calo, A. Michael Froomkin and Ian Kerr (eds), Robot Law, Edward Elgar, Cheltenham, 2016, p. 385.

93 F. Sauer and N. Schörnig, above note 41; Sarah E. Kreps, “Just Put It on Our Tab: War Financing and the Decline of Democracy”, War on the Rocks, 28 May 2018, available at: https://warontherocks.com/2018/05/just-put-it-on-our-tab-21st-century-war-financing-and-the-decline-of-democracy/.

94 Denise Garcia, “Killer Robots: Toward the Loss of Humanity”, Ethics and International Affairs, April 2015, available at: www.ethicsandinternationalaffairs.org/2015/killer-robots-toward-the-loss-of-humanity/.

95 DoD, above note 16.

96 E. B. Kania, Battlefield Singularity, above note 46.

97 KRC, “Global Poll Shows 61% Oppose Killer Robots”, 22 January 2019, available at: www.stopkillerrobots.org/2019/01/global-poll-61-oppose-killer-robots/.

98 Open Roboethics Institute, “The Ethics and Governance of Lethal Autonomous Weapons Systems: An International Public Opinion Poll”, 9 November 2015, available at: www.openroboethics.org/wp-content/uploads/2015/11/ORi_LAWS2015.pdf.

99 Heather M. Roff, “What Do People Around the World Think about Killer Robots?”, Slate, 8 February 2017, available at: https://slate.com/technology/2017/02/what-do-people-around-the-world-think-about-killer-robots.html.

100 KRC, above note 97.

101 KRC, “New European Poll Shows Public Favour Banning Killer Robots”, 13 November 2019, available at: www.stopkillerrobots.org/2019/11/new-european-poll-shows-73-favour-banning-killer-robots/.

102 KRC, above note 97.

103 A. Sharkey, above note 85, p. 9.

104 R. Charli Carpenter, “Lost” Causes: Agenda Vetting in Global Issue Networks and the Shaping of Human Security, Cornell University Press, Ithaca, NY, 2014, pp. 88–121.

105 Stephen D. Goose and Mary Wareham, “The Growing International Movement Against Killer Robots”, Harvard International Review, Vol. 37, No. 4, 2016, available at: www.jstor.org/stable/26445614.

106 CCW Meeting Final Report, above note 2, p. 10.

107 For the notion of codifying human control as a principle of IHL in general, see Elvira Rosert, How to Regulate Autonomous Weapons, PRIF Spotlight 6/2017, Peace Research Institute Frankfurt, 2017, available at: www.hsfk.de/fileadmin/HSFK/hsfk_publikationen/Spotlight0617.pdf.

108 V. Boulanin et al., above note 26. See also Ilse Verdiesen, Filippo Santoni de Sio and Virginia Dignum, “Accountability and Control over Autonomous Weapon Systems: A Framework for Comprehensive Human Oversight”, Minds and Machines, 2020, available at: https://link.springer.com/article/10.1007/s11023-020-09532-9.

109 Moyes, “Target Profiles”, above note 26.

110 For this general approach as well as a list of variables to consider, see V. Boulanin et al., above note 26, pp. 30–33.

111 F. Sauer et al., above note 79, pp. 42–45.

112 Z. Davis, above note 35, p. 122.

113 Frank Sauer, Atomic Anxiety: Deterrence, Taboo and the Non-Use of U.S. Nuclear Weapons, Palgrave Macmillan, London, 2015, pp. 91–92.

114 Robert H. Latiff and Patrick J. McCloskey, “With Drone Warfare, America Approaches the Robo-Rubicon”, Wall Street Journal, 14 March 2013, available at: https://tinyurl.com/y2t7odsh.