
Mapping the Lethal Autonomous Weapons Debate: An Introduction

Published online by Cambridge University Press:  01 December 2023

Josephine Jackson*
Affiliation:
University of St Andrews, St Andrews, Scotland

Abstract

The UN Convention on Certain Conventional Weapons (CCW) can, on the one hand, be considered vital for the global governance process—in the sense of urging international cooperation on the ethical, developmental, and standards aspects of lethal autonomous weapon systems (LAWS). On the other hand, the CCW may also embody a global trend that does not augur well for international solidarity, namely the lack of credible and comprehensive collaboration to advance global objectives of peace and security. In 2022, a majority of the 125 nations that belong to the CCW called for limits on a specific type of lethal autonomous weapons: “killer robots.” Yet most of the major global powers—namely, the United States, Russia, and China—opposed not only a ban on LAWS but also any restrictions on the development of these weapons, not least because all three are actively developing this weapons technology. While there is currently much focus on the technological evolution of LAWS, less has been written about how ethical values can exert influence on a growing global consciousness around factors such as power, technology, human judgment, accountability, autonomy, dehumanization, and the use of force. This introduction lays the groundwork for dealing with these issues. It does so by showing that all of these factors warrant a pluralist approach to the global governance of LAWS, one grounded in military, tech, legal, and distinctive theoretical-ethical orientations; the rationale is to combine this expertise in a single collection. Drawing on the contributing authors’ firsthand experience of the ethics surrounding the management of LAWS to address decisive and critical questions at an expert level, it provides a framing for the collection, showing that international legal mechanisms like the CCW are crucial to considering both the potential and the limits of LAWS, as well as what their governance can contribute to areas such as international law, human rights, and national security.

Type
Roundtable: Global Governance and Lethal Autonomous Weapon Systems
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
Copyright © The Author(s), 2023. Published by Cambridge University Press on behalf of Carnegie Council for Ethics in International Affairs

The UN Convention on Certain Conventional Weapons (CCW) first entered into force in 1983.Footnote 1 The CCW seeks to prohibit or restrict the use of certain conventional weapons whose effects are indiscriminate and/or excessively injurious, such as land mines, booby traps, incendiary devices, blinding laser weapons, and explosive remnants of war.Footnote 2 There are five protocols (restrictions) to the CCW: (1) restrictions on weapons with non-detectable fragments; (2) restrictions on landmines and booby traps; (3) restrictions on incendiary weapons; (4) restrictions on blinding laser weapons; and (5) best practices for the clearance of explosive remnants of war.Footnote 3 By the end of 2020, there were 125 state parties to the convention, with some of those countries having adopted only some of the five protocols.Footnote 4 Since 2017, efforts have been underway to open negotiations in two areas: (1) adding a regulatory mechanism to ensure that parties to the convention are honoring their commitments; and (2) expanding the scope of the convention to restrict weapons such as over-sized caliber bullets, anti-vehicle mines, and lethal autonomous weapons. As of 2022, a majority of the 125 nations that belong to the convention said they wanted curbs on a specific type of lethal autonomous weapons known colloquially as “killer robots.” But most of the major global powers—namely, the United States, Russia, and China—opposed not only a ban on lethal autonomous weapons but also any restrictions on the development of these weapons, not least because all three are actively developing this weapons technology. The United States’ position is that existing international laws are sufficient and that banning autonomous weapons technology would be premature. The United States has instead proposed a nonbinding code of conduct regulating activities in the area of lethal autonomous weapons. Critics, such as Human Rights Watch and Article36, a non-profit organization focused on reducing harm from weapons systems, have dismissed this proposal as ineffective and inadequate to the scale of the problem.Footnote 5

The topic at issue in this roundtable is lethal autonomous weapon systems (LAWS), a subject bound up with new developments in warfighting, armed conflict, and weapons technologies, as well as with associated concepts such as artificial intelligence (AI), autonomous weapon systems (AWS), and machine learning (ML). The publications to date on LAWS, AI, ML, and associated ethical topics are detailed and voluminous. Yet the fact that these systems are still incomplete and can in principle be shaped politically means that their complex capabilities and possibilities are expected to remain intensely contested for quite some time. Evidence for this can be seen in the policy analyses appearing in this collection, which offer different perspectives from the tech, legal, academic, and military fields. The essays were written by leading experts in these fields, offering insights that could contribute to the governance of LAWS at all levels: local, national, and global.

Can and should the international community use international legal means and/or institutions, such as the CCW, to regulate lethal autonomous weapons, or turn to other major legal instruments? Conversely, can and should the international community rely instead on national political systems and military rules to regulate lethal autonomous weapons? Given the global proliferation implications of lethal autonomous weapons, do we need new rules for intelligence collection and analysis? What are the most pressing ethical concerns surrounding the governance of lethal AI? These are the questions we asked our contributing authors, and in their responses they identified the normative and operational underpinnings of LAWS and subjected them to intense discussion.

The contributing authors were deliberately selected for their recognized standing and professional experience in the military, legal, tech, and academic sectors, so as to present different facets of LAWS; we hope this will encourage mutual learning and enhance the current debates on these issues. As the research, development, and marketing of LAWS are vast and growing, this introduction provides readers with a brief overview of the main concepts, practices, and moral quandaries that our contributing authors have worked on, researched, and, in some cases, continue to grapple with. The contributions are organized into five essays, each with its own unique perspective: military; tech/industry; academia specializing in law; academia specializing in ethics; and academia specializing in global constitutionalism and global institutions. This introduction concludes by noting the implications that collective governance—or the lack thereof—of LAWS has for international society.

A Pluralist Approach to LAWS

Since power and dominance (primarily military), as well as the ethical weight of what those concepts entail, are at the root of the story of LAWS, this collection starts from the perspective of the U.S. Department of Defense (DoD). We asked U.S. Air Force Lieutenant General (ret.) David Deptula, the Dean of the Mitchell Institute for Aerospace Studies, to consider how functions of the DoD and its major components, including commanders of the eleven unified Combatant Commands,Footnote 6 achieve the strategic objectives of the U.S. National Security Strategy with respect to autonomy in weapons systems. Deptula argues that AWS have the potential to make the U.S. military both a more effective and a more ethical fighting force, better able to achieve objectives while hewing more closely to the tenets of just war. He believes that the algorithms for AWS can be developed to be better than humans at avoiding collateral damage and misinterpretation on the battlefield. Deptula concludes his essay by identifying several key steps, short of international bans or limiting conventions, that could be taken to regulate (or restrain) LAWS—the most effective being, in his view, building a credible deterrent.

In their essay, Arun Seraphin and Wilson Miles bring an emerging-technology perspective, partly rooted in their current work in the Emerging Technologies Institute at the National Defense Industrial Association, to bear on bridging the military, policy, and technical communities on the use and regulation of LAWS. On the one hand, Seraphin and Miles agree with Deptula that AI and ML can improve warfighting and operational techniques. Yet, on the other hand, they do not shy away from identifying several important technical issues inherent to autonomous defense systems that will present complex policy and ethical challenges to stakeholders across national frontiers. Some examples most relevant to the global governance of LAWS include “[how] systems deal with compromised, biased and synthetic datasets; the level and manner of human interaction with these systems, both controlling their behaviors and teaming with them to execute missions; [and] the predictability and reliability of the systems, especially when dealing with rare or unexpected situations.”Footnote 7 The common thread running through these examples is that there is currently neither agreement on what constitutes “autonomy” in weapons systems nor any clarity about how such situations should be regulated. In response, Seraphin and Miles conclude with four recommendations for action by the U.S. government, based on the current state of technology and expertise as well as the authors’ evaluation of prior incongruencies between the military, technical, and policy communities.

Mary Ellen O'Connell, the Robert & Marion Short Professor of Law and Professor of International Peace Studies at the University of Notre Dame, offers in her contribution a fundamentally critical view of LAWS. Incorporating an understanding of military force as “defending the role of law, not superseding it,”Footnote 8 O'Connell advances a compelling argument for banning autonomous weapons as a legal and ethical mandate. Four main elements underpin her reasoning: a rejection of political realism in favor of legality; the incompatibility of LAWS with prevailing normative standards; the challenges of applying natural law principles to particular cases; and a deep skepticism toward the technology's promises.

Esther D. Reed, a professor of theological ethics in the department of theology and religion at the University of Exeter, picks up on the theme of human dignity highlighted by O'Connell and expands upon it, highlighting the need to implement an accountability principle in the taking of human life with autonomous weapon systems at war. Reed questions whether these new weapons technologies essentially entail “inherently evil decisions and actions,” thus substantiating the need for civil counterbalances such as human responsibility and supervision.Footnote 9 Much like O'Connell, Reed contends that human dignity, as grounded in international human rights law and international humanitarian law, lies at the root of LAWS’ weaknesses and shortcomings. Without a constant commitment to accountability for the use of force, Reed argues, the prospects for “justice in war and sub-war contexts are undermined… catastrophically.”Footnote 10 Yet there are steps for protecting these values. One example is what Reed calls “the whole picture” approach.Footnote 11 This entails considering the whole of, for instance, a certain political decision of a government and asking how the aspects of that decision affect not just the life cycle of a particular weapons system but also how “the many ethical aspects impinge.”Footnote 12 In short, accountability in AWS is a byword for justice—because you cannot have one without the other.

Both the complexity and significance of the law and ethics approach taken by O'Connell and Reed are complemented by another perspective on ethical efforts to regulate LAWS—namely, a virtues-based account, as proposed by our fifth and final contributor, Anthony Lang, Jr., chair in international political theory in the School of International Relations at the University of St Andrews. Lang's argument is built in two parts: (1) showing how rules- and consequences-based accounts “fail to provide adequate guidance for how to deal [with LAWS]”; and (2) providing a defense of an Aristotelian framework that might better achieve the international community's objective of regulating LAWS.Footnote 13 Lang begins by examining previous prohibitions on weapons of war, such as the medieval church's attempt to forbid the crossbow and the ban on chemical weapons prior to World War I.Footnote 14 These efforts produced mixed results, and one of the lessons Lang takes away from them is that regulating weapons “requires something more than formal declarations.”Footnote 15 Lang's argument is that the concepts associated with the tradition of virtue ethics, notwithstanding legitimate critiques and commentaries, impart useful knowledge about the development of rules and laws and, in turn, about their relationship to any attempt by the international polity to regulate LAWS. Drawing on the work of Daniele Amoroso and Guglielmo Tamburrini, Lang summarizes three principles by which we should evaluate LAWS: (1) humans must always be the key agents in any LAWS application; (2) systems must be designed to always ensure human accountability throughout the operational process; and (3) humans must be “moral agency enactors”—or, in other words, humans must act to “protect the dignity of those subject to the use of such weapons systems.”Footnote 16 Yet Lang cautions that even with such guidelines, the international community cannot expect smooth decision-making in this regard, especially given the rapid advances in technology and science and the many associated ethical and legal challenges.

Conclusion

Regulation, noted British politician and businessman Michael Heseltine, is “what separates the law of the jungle from civilized society.”Footnote 17 To be sure, Heseltine was referring to the post-Brexit economic climate in the U.K.; yet the same logic applies to the policy principles underpinning the ethical governance of LAWS. We live in a world in which politicians, technological experts, AI stakeholders, and the global citizenry face a multitude of complex challenges and circumstances that do not fit smoothly into existing social and ethical frameworks. As our contributing authors make clear, when progress has been made on LAWS-related issues, such as in the form of the CCW, it has often come two steps forward, one step back. The challenge now is to build on these promising first steps, with a view to effectively managing all dimensions of LAWS—legal, professional, ethical, and normative—in ways that improve adherence to the provisions of international law, and in particular to international humanitarian and human rights law.

References

Notes

1 United Nations Office for Disarmament Affairs, “Conventional Arms,” 2023. Conventional weapons are “weapons other than weapons of mass destruction. [They can] include battle tanks, armored combat vehicles, large-caliber artillery systems, combat aircraft and unmanned combat aerial vehicles (UCAV), attack helicopters, warships, missiles and missile launchers, landmines, cluster munitions, small arms, and light weapons and ammunition.”

2 United Nations Office for Disarmament Affairs, Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, 2022, treaties.unoda.org/t/ccwc.

3 Ibid.

4 Countries must adopt a minimum of two protocols to be considered a party to the Convention.

5 Human Rights Watch, “U.S.: New Policy on Autonomous Weapons Flawed,” 2023, www.hrw.org/news/2023/02/14/us-new-policy-autonomous-weapons-flawed; and Richard Moyes, “Autonomous Weapons: Targeting People Should Be Prohibited,” Article36, 2019, article36.org/updates/anti-personnel-prohibition/.

6 Africa Command; Central Command; Cyber Command; European Command; Indo-Pacific Command; Northern Command; Southern Command; Space Command; Special Operations Command; Strategic Command; and Transportation Command.

7 Arun Seraphin and Wilson Miles, “Toward a Balanced Approach: Bridging the Military, Policy, and Technical Communities,” Ethics & International Affairs 37, no. 3 (2023).

8 Mary Ellen O'Connell, “Banning Autonomous Weapons: A Legal and Ethical Mandate,” Ethics & International Affairs 37, no. 3 (2023), p. 1. Emphasis added.

9 Esther D. Reed, “Accountability for the Taking of Human Life with LAWS at War,” Ethics & International Affairs 37, no. 3 (2023).

10 Ibid.

11 Ibid.

12 Ibid.

13 Anthony Lang, Jr., “Regulating Weapons: An Aristotelian Account,” Ethics & International Affairs 37, no. 3 (2023).

14 Ibid.

15 Ibid.

16 Ibid.

17 Michael Heseltine, “Michael Heseltine: Fifty Years Ago, We Joined the EU—Today, I Deplore the Deception of Brexit,” January 1, 2023, uk.news.yahoo.com/voices-michael-heseltine-fifty-years-103419114.html.