
THE COMPATIBILITY OF AUTONOMOUS WEAPONS WITH THE PRINCIPLE OF DISTINCTION IN THE LAW OF ARMED CONFLICT

Published online by Cambridge University Press:  07 October 2020

Elliot Winter*
Affiliation:
Lecturer, Newcastle University Law School, [email protected].

Abstract

The law of armed conflict requires ‘distinction’ between civilians and combatants and provides that only the latter may be targeted. However, for proper implementation, distinction requires advanced observation and recognition abilities as well as the capacity to exercise judgement based on situational awareness. While the observation and recognition abilities of machines may now surpass those of humans, the capacity of machines to exercise judgement remains significantly more limited than our own. Consequently, this article contends that the deployment of ‘autonomous weapons’ based on current levels of technological sophistication would be incompatible with distinction and that, as such, their use in conflict would be unlawful.

Type
Articles
Copyright
Copyright © The Author(s) 2020. Published by Cambridge University Press for the British Institute of International and Comparative Law

Footnotes

A debt of gratitude is owed to Dr Conall Mallory and Dr Elena Katselli for their support.

References

1 O Ulgen, ‘Human Dignity in an Age of Autonomous Weapons: Are We in Danger of Losing an “Elementary Consideration of Humanity”?’ (2017) 17 Baltic Yearbook of International Law 167.

2 D Amoroso and B Giordano, ‘Who Is to Blame for Autonomous Weapons Systems' Misdoings?’ in E Carpanelli and N Lazzerini (eds), Use and Misuse of New Technologies: Contemporary Challenges in International and European Law (Springer 2019) 211.

3 PW Singer, Wired for War: The Robotics Revolution and Conflict in the 21st Century (Penguin 2009) 67.

4 Protocol Additional to the Geneva Conventions of 12 August 1949 and relating to the Protection of Victims of International Armed Conflicts (adopted 8 June 1977, entered into force 7 December 1978) 1125 UNTS 3 art 48.

5 Protocol Additional to the Geneva Conventions of 12 August 1949 and relating to the Protection of Victims of Non-International Armed Conflicts (adopted 8 June 1977, entered into force 7 December 1978) 1125 UNTS 609 art 13(2).

6 R Kolb, Advanced Introduction to International Humanitarian Law (Edward Elgar 2014) 78.

7 E Winter, ‘Pillars not Principles: The Status of Humanity and Military Necessity in the Law of Armed Conflict’ (2020) 25 Journal of Conflict & Security Law 1.

8 Legality of the Threat or Use of Nuclear Weapons (Advisory Opinion) [1996] ICJ Rep 226 para 78.

9 United Kingdom Ministry of Defence, The Manual of the Law of Armed Conflict (Oxford University Press 2004) 21.

10 Danish Ministry of Defence, Military Manual on International Law Relevant to Danish Armed Forces in International Operations (Defence Command Denmark 2016) 145–55.

11 New Zealand Defence Force, Manual of Armed Forces Law, vol 4 (DM69, 2nd edn, New Zealand Defence Force 2019) 4.6.1.

12 GD Solis, The Law of Armed Conflict (2nd edn, Cambridge University Press 2016) 309.

13 Kolb (n 6) 77.

14 J Pictet, Development and Principles of International Humanitarian Law (Martinus Nijhoff 1985).

15 Statute of the International Court of Justice (adopted 26 June 1945, entered into force 24 October 1945) UKTS 67 (1946) art 38(1)(b).

16 JM Henckaerts and L Doswald-Beck, Customary International Humanitarian Law, Volume I: Rules (Cambridge University Press 2005).

17 ibid 3.

18 MJ Matheson, ‘The United States Position on the Relation of Customary International Law to the 1977 Protocols Additional to the 1949 Geneva Conventions’ (1987) 2 American University Journal of International Law and Policy 419.

19 MN Schmitt and E Widmar, ‘The Law of Targeting’ in PAL Ducheine et al. (eds), Targeting: The Challenges of Modern Warfare (Springer 2016) 121.

20 United States Department of Defense, ‘Autonomy in Weapons Systems’ (2012) Directive 3000.09, Glossary Part II <https://bit.ly/2UCP4fc>.

21 S Casey-Maslen, ‘Pandora's Box? Drone Strikes Under jus ad bellum, jus in bello, and International Human Rights Law’ (2012) 94/886 International Review of the Red Cross 597.

22 International Committee of the Red Cross, ‘Autonomous Weapon Systems - Q&A’ (ICRC, 12 November 2014) <http://bit.ly/2ixib2p>.

23 Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects (adopted 10 October 1980, entered into force 2 December 1983) 1342 UNTS 137.

24 Convention on Conventional Weapons, ‘Meeting of the High Contracting Parties: Final Report’ (16 December 2013) UN Doc CCW/MSP/2013/10 para 32.

25 Group of Governmental Experts, ‘Report of the 2018 Session of the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems’ (23 October 2018) UN Doc CCW/GGE.1/2018/3, Annex III (Chair's Summary) para 2.

26 ibid paras 2 and 5.

27 United Kingdom, ‘Statement to the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems’ Plenary Meeting of the Group of Governmental Experts (25–29 March 2019) para 3.

28 United States Department of Defense (n 20).

29 France, ‘Characterization of a LAWS’ Informal Meeting of the Group of Governmental Experts (11–15 April 2016).

30 United Kingdom, ‘Working towards a Definition of LAWS’ Informal Meeting of the Group of Governmental Experts (11–15 April 2016) para 4.

31 Group of Governmental Experts (n 25), Annex III (Chair's Summary) para 6.

32 ibid.

33 Defense Science Board, Task Force Report: The Role of Autonomy in DoD Systems (United States Department of Defense 2012) 21 and 59.

34 JM Bradshaw et al., ‘The Seven Deadly Myths of “Autonomous Systems”’ (2013) 28 IEEE Intelligent Systems 54, 54.

35 L Suchman and J Weber, ‘Human–Machine Autonomies’ in N Bhuta et al. (eds), Autonomous Weapons Systems: Law, Ethics, Policy (Cambridge University Press 2016).

36 L Van Rompaey, ‘Shifting from Autonomous Weapons to Military Networks’ (2019) 10 Journal of International Humanitarian Legal Studies 111, 111.

37 ibid 115.

38 United Kingdom (n 27) para 4.

39 M Tegmark, Life 3.0: Being Human in the Age of Artificial Intelligence (Allen Lane 2017).

40 Raytheon Missiles & Defense, ‘Phalanx Weapon System’ (Raytheon Missiles & Defense) <https://bit.ly/2UEy4Fw>.

41 Raytheon Missiles & Defense, ‘Iron Dome System and SkyHunter Missile’ (Raytheon Missiles & Defense) <https://bit.ly/3dTYTNz>.

42 Dodaam Systems, ‘Super aEgis II: The Best Mobile Remote Controlled Weapon Station’ (Dodaam Systems) <http://bit.ly/2G0Hlhi>.

43 BAE Systems, ‘Taranis’ (BAE Systems) <http://bit.ly/2uamk2j>.

44 Boston Dynamics, ‘Atlas’ (Boston Dynamics) <http://bit.ly/2sP5pwi>.

45 Group of Governmental Experts (n 25) para 28(a).

46 ibid, para 28(b).

47 ibid, para 28(c).

48 ibid, para 28(d).

49 R Gowan, ‘Muddling Through to 2030: The Long Decline of International Security Cooperation’ (2018) 42 The Fletcher Forum of World Affairs 55.

50 Additional Protocol I (n 4) art 36.

51 Singer (n 3) 67.

52 Additional Protocol I (n 4) art 57(2)(a)(i).

53 MN Schmitt, ‘Autonomous Weapon Systems and International Humanitarian Law: A Reply to the Critics’ (2013) Harvard National Security Journal 23 <https://bit.ly/2WCnpwb>.

54 JS Thurnher, ‘Feasible Precautions in Attack and Autonomous Weapons’ in WH von Heinegg, R Frau and T Singer (eds), Dehumanization of Warfare: Legal Implications of New Weapon Technologies (Springer 2018) 109.

55 The Physics arXiv Blog, ‘Neural Net Learns Breakout Then Thrashes Human Gamers’ (The Physics arXiv Blog, 23 December 2013) para 9 <https://bit.ly/2swuqCf>.

56 ibid, para 2.

57 ibid, para 9.

58 V Mnih et al., ‘Playing Atari with Deep Reinforcement Learning’ (2013) Cornell University arXiv <https://bit.ly/38hI8b8>.

59 ibid 2 and 7.

60 ibid 8.

61 ibid 5–6.

62 GRASP, ‘Research Projects’ (GRASP) <https://bit.ly/2xJV7pw>.

63 S Crowe, ‘Exyn Drone Maps Inactive Mine on the Fly’ (The Robot Report, 19 November 2019) para 3 <https://bit.ly/35R3YRo>.

64 ibid, para 10.

65 ibid, para 9.

66 VR Leotaud, ‘Exyn Technologies Introduces Robots into Dundee Precious Metals’ Gold Mines’ Mining.Com (28 February 2019) <https://bit.ly/2vCBEWO>.

67 ibid, para 4.

68 ibid, para 5.

69 Nanalyze, ‘How Autonomous Drone Flights Will Go Beyond Line of Sight’ (Nanalyze, 31 December 2019) <https://bit.ly/3aGR7o8>.

70 United States, ‘Electronic Code of Federal Regulations’ Title 14 Chapter I Subchapter F Part 107.

71 Nanalyze, ‘7 Startups Using Drones for Inspections & Monitoring’ (Nanalyze, 18 July 2017) <https://bit.ly/2R7eJuo>.

72 Airbus, ‘Airbus Demonstrates First Fully Automatic Vision-Based Take-Off’ (Airbus, 16 January 2020) para 1 <https://bit.ly/2vjMWyK>.

73 GH Hunt, ‘The Evolution of Fly-By-Wire Control Techniques in the UK’ (1979) 83 The Aeronautical Journal 165.

74 International Telecommunication Union, Radio Regulations (International Telecommunication Union 2012) 16 (art 1.104).

75 Patriot One Technologies, ‘About’ (Patriot One Technologies) <https://bit.ly/3dT3UWH>.

76 ibid, para 3.

77 M Rocque and G Duwe, ‘Rampage Shootings: An Historical, Empirical, and Theoretical Overview’ (2018) 19 Current Opinion in Psychology 28, 30.

78 Patriot One Technologies, ‘Introducing the PatScan Multi-Sensor Covert Threat Detection Platform’ (Patriot One Technologies) <https://bit.ly/2x4Jucx>.

79 Patriot One Technologies (n 75).

80 ibid.

81 ibid.

82 ibid.

83 Patriot One Technologies, ‘Patriot One Wins Best in Category at Security Industry Association Event at ISC West’ (Patriot One Technologies, 4 June 2017) <https://bit.ly/2TIqTMl>.

84 Airsoc, ‘Where Hazards Lurk’ (Airsoc, January 2020) para 5 <https://bit.ly/2Tzvcb5>.

85 ibid para 3.

86 ibid para 6.

87 ibid para 2.

88 Nanalyze (n 69) para 20.

89 W Hays Parks, ‘Special Forces’ Wear of Non-Standard Uniforms’ (2003) 4 Chicago Journal of International Law 493, 542.

90 M Grant and T Huntley, ‘Legal Issues in Special Operations’ in G Corn et al. (eds), US Military Operations: Law, Policy and Practice (Oxford University Press 2016) 589.

91 Convention (III) Relative to the Treatment of Prisoners of War (adopted 12 August 1949, entered into force 21 October 1950) 75 UNTS 135 art 4A(2).

92 Convention (IV) Respecting the Laws and Customs of War on Land (adopted 18 October 1907, entered into force 26 January 1910) 187 CTS 227, Annex on Regulations Concerning the Laws and Customs of War on Land art 1(2).

93 I Gillich, ‘Illegally Evading Attribution? Russia's Use of Unmarked Troops in Crimea and International Humanitarian Law’ (2015) 48 Vanderbilt Journal of Transnational Law 1215.

94 Additional Protocol I (n 4) art 44(3).

95 Gillich (n 93) 1215.

96 Additional Protocol I (n 4) art 44(7).

97 Hays Parks (n 89) 542.

98 Geneva Convention III (n 91) art 4A(2).

99 Additional Protocol I (n 4) art 44(3).

100 ibid.

101 S Walker, ‘Russian Takeover of Crimea Will Not Descend into War, Says Vladimir Putin’ The Guardian (4 March 2014) <https://bit.ly/2UK8peT>.

102 M Lipman, ‘Putin's Crisis Spreads’ The New Yorker (8 March 2014) <https://bit.ly/2vsPf2u>.

103 R Heinsch, ‘Conflict Classification in Ukraine: The Return of the “Proxy War”?’ (2015) 91 International Law Studies 323, 328.

104 Gillich (n 93) 1208.

105 SR Reeves and D Wallace, ‘The Combatant Status of the “Little Green Men” and Other Participants in the Ukraine Conflict’ (2015) 91 International Law Studies 361, 394.

106 Grant and Huntley (n 90) 594.

107 Reeves and Wallace (n 105) 394.

108 Gillich (n 93) 1213.

109 ibid.

110 Hays Parks (n 89) 542.

111 Human Rights Watch, ‘Questions and Answers: Russia, Ukraine, and International Humanitarian and Human Rights Law’ (Human Rights Watch, 21 March 2014) <https://bit.ly/3bzcqJl>.

112 Gillich (n 93) 1212.

113 S Raviv, ‘The Secret History of Facial Recognition’ Wired (21 January 2020) <https://bit.ly/2U4jYxa>.

114 ibid, para 16.

115 ibid, para 49.

116 H Zuo, L Wang and J Qin, ‘XJU1: A Chinese Ethnic Minorities Face Database’ (2017) IEEE <https://bit.ly/37KhFDL>.

117 D Byler, ‘Ghost World’ Logic (1 May 2019) para 3 <https://bit.ly/2GsLAE7>.

118 B Read and R Walters, ‘China: Do the Uighurs Represent a Serious Threat?’ (2019) James Madison University Scholarly Commons <https://bit.ly/30ZI5Pb>.

119 J Honovich, ‘Hikvision's Minority Analytics’ IPVM (8 May 2018) <https://bit.ly/2RyG6xZ>.

120 L Chutel, ‘China is Exporting Facial Recognition Software to Africa, Expanding its Vast Database’ Quartz Africa (25 May 2018) <https://bit.ly/2RUSHL4>.

121 Byler (n 117) para 17.

122 D Shang et al., ‘Face and Lip-Reading Authentication System Based on Android Smart Phones’ (2019) IEEE <https://bit.ly/2IC69z4>.

123 WK Zhang and MJ Kang, ‘Factors Affecting the Use of Facial-Recognition Payment: An Example of Chinese Consumers’ (2019) IEEE <https://bit.ly/2TFDfo9>.

124 T Yu et al., ‘AI-based Targeted Advertising System’ (2019) 13 Indonesian Journal of Electrical Engineering and Computer Science (February) 787.

125 A Holmes, ‘Microsoft Funded an Israeli Facial Recognition Startup Whose Tech Is Reportedly Being Used to Secretly Surveil Palestinians’ Business Insider (28 October 2019) <https://bit.ly/2PZLYPB>.

126 T Maddox, ‘PatScan Platform Detects Hidden Weapons, Chemicals, and Bombs’ TechRepublic (10 January 2020) <https://tek.io/2IdOFZB>.

127 Patriot One Technologies (n 75).

128 ibid.

129 ibid.

130 ibid.

131 ibid.

132 Nanalyze, ‘Watch for These 8 AI Startups Doing Computer Vision’ (Nanalyze, 13 March 2018) <https://bit.ly/2UItO7W>.

133 P Li and C Cadell, ‘At Beijing Security Fair: An Arms Race for Surveillance Tech’ Reuters (28 May 2018) <https://reut.rs/2RCJPuJ>.

134 N Eddy, ‘Google AI Platform Aids Oncologists in Breast Cancer Screenings’ HealthcareITNews (7 January 2020) <https://bit.ly/2tZL7H1>.

135 A Esteva et al., ‘Dermatologist-Level Classification of Skin Cancer with Deep Neural Networks’ (2017) 542 Nature 115.

137 SM McKinney et al., ‘International Evaluation of an AI System for Breast Cancer Screening’ (2020) 577 Nature 89, 89.

138 ibid.

139 ibid 92.

140 T Hu, ‘China AI Startup Malong Technologies Wins WebVision Challenge’ PR Newswire (27 July 2017) para 5 <https://prn.to/2TCozpS>.

141 ibid, para 6.

142 Airsoc (n 84) para 11.

143 Additional Protocol I (n 4) art 52.

144 Additional Protocol I (n 4) art 41(2).

145 Reeves and Wallace (n 105) 386.

146 Additional Protocol I (n 4) art 50(1).

147 ibid, art 50(3).

148 Geneva Convention III (n 91) art 4A(6).

149 N Melzer, ‘The Principle of Distinction between Civilians and Combatants’ in A Clapham and P Gaeta (eds), The Oxford Handbook of International Law in Armed Conflict (Oxford University Press 2014) 298.

150 Additional Protocol I (n 4) art 51(3).

151 Additional Protocol II (n 5) art 13(3).

152 N Melzer, Interpretive Guidance on the Notion of Direct Participation in Hostilities under International Humanitarian Law (ICRC 2009) 47.

153 ibid 53.

154 ibid 58–64.

155 Prosecutor v Pavle Strugar (Appeal Judgment), ICTY-01-42 (17 July 2008).

156 ibid, para 177.

157 N Bostrom, Superintelligence: Paths, Dangers, Strategies (Oxford University Press 2014).

158 Tegmark (n 39).

159 S Kriegman et al., ‘A Scalable Pipeline for Designing Reconfigurable Organisms’ (2020) Proceedings of the National Academy of Sciences 5 <https://bit.ly/36VCuf1>.

160 United Kingdom (n 30) para 4.

161 N Sharkey, ‘The Evitability of Autonomous Robot Warfare’ (2013) 94 International Review of the Red Cross (New Technologies and Warfare) 787.

162 S Shead ‘Researchers: Are We on the Cusp of an “AI Winter”?’ BBC News Online (12 January 2020) <https://bbc.in/38jFFgT>.

163 VC Müller and N Bostrom, ‘Future Progress in Artificial Intelligence: A Survey of Expert Opinion’ in VC Müller (ed), Fundamental Issues of Artificial Intelligence (Springer 2016).

164 T Walsh, 2062: The World that AI Made (La Trobe University Press 2018).

165 Mnih et al. (n 58) 8.

166 The Physics arXiv Blog (n 55) para 11.

167 ibid, para 16.

168 ibid, para 6.

169 ibid, para 6.

170 Winter (n 7).

171 The Physics arXiv Blog (n 55) para 7.

172 E Gibney, ‘Google AI Algorithm Masters Ancient Game of Go’ (2016) 529 Nature 445, 445.

173 ibid 446.

174 ibid.

175 Agrointelli, ‘Our Company’ (Agrointelli) <https://bit.ly/3arI8qJ>.

176 Agrointelli, ‘RoboWeedMaPS’ (Agrointelli) <https://bit.ly/3dAkMkZ>.

177 ibid, para 1.

178 Agrointelli (n 176).

179 RN Jorgensen, ‘RoboWeedMaPS: How Deep Learning Can Help Farmers Get Rid of Weeds’ Aarhus University Department of Engineering (14 January 2019) <https://bit.ly/30aI4rm>.

180 McKinney et al. (n 137) 89.

181 ibid 96.

182 ibid 91.

183 Airbus (n 72) para 6.