
Autonomous Weapon Systems are weapon systems equipped with Artificial Intelligence (AI). They are increasingly deployed on the battlefield (Dawes, 2023; Roff, 2016; Tucker, 2023). Autonomous systems can have many benefits in the military domain. In Ukraine, for example, the Fortem DroneHunter F700, an autonomous drone with radar control and artificial intelligence, is deployed to shield the country’s energy facilities from Russian attacks (Soldak, 2023). Yet the nature of Autonomous Weapon Systems might also lead to security risks and unpredictable behaviour, as the Non-Governmental Organisations (NGOs) Human Rights Watch (2023) and the International Committee of the Red Cross (ICRC, 2023) indicate in their statements to the Group of Governmental Experts (GGE) on emerging technologies in the area of Lethal Autonomous Weapons Systems (LAWS) of the Convention on Certain Conventional Weapons (CCW) of the United Nations.

In addition to security risks and unpredictable behaviour, the impact on human dignity and the emergence of an accountability gap are mentioned as concerns with the use of Autonomous Weapon Systems. The alleged offence to human dignity entailed in delegating life-or-death decision-making to a machine is linked to the value of human life. The Campaign to Stop Killer Robots (2023) states on its website that ‘…a machine should not be allowed to make a decision over life and death’, because it lacks human judgement and understanding of the context of its use. The United Nations has also voiced its concerns, stating that ‘Autonomous weapons systems that require no meaningful human control should be prohibited, and remotely controlled force should only ever be used with the greatest caution’ (General Assembly United Nations, 2016).

At the same time, many scholars express the concern that Autonomous Weapon Systems will lead to an ‘accountability gap’ or ‘accountability vacuum’: circumstances in which no human can be held accountable for the decisions, actions and effects of Autonomous Weapon Systems (Matthias, 2004; Asaro, 2012; Asaro, 2016; Crootof, 2015; Dickinson, 2018; Horowitz & Scharre, 2015; Wagner, 2014; Sparrow, 2016; Roff, 2013; Galliott, 2015). This concern is also reflected in one of the guiding principles for LAWS of the GGE on emerging technologies in the area of LAWS of the CCW of the United Nations: ‘Human responsibility for decisions on the use of weapons systems must be retained since accountability cannot be transferred to machines. This should be considered across the entire lifecycle of the weapon system’ (UN GGE LAWS, 2018).

Hence, the deployment of Autonomous Weapon Systems on the battlefield without direct human oversight is not only a military revolution, according to Kaag and Kaufman (2009), but can also be considered a moral one. As large-scale deployment of AI on the battlefield seems unavoidable (Rosenberg & Markoff, 2016), research on ethical and moral responsibility is imperative.

RkJQdWJsaXNoZXIy MjY0ODMw