Figure 5: Elements of accountability concept (as in: Bovens, 2007)

Accountability gaps

Many scholars point to accountability gaps that may occur in the deployment of Autonomous Weapon Systems. However, what the authors below refer to as accountability is what we in this research, based on the work of Van de Poel (2011), call 'blameworthiness' or 'culpability'. Those notions are related to backward-looking responsibility, but they are not the same as the concept of accountability as we employ it in this research. In this section we identify these different uses of the term accountability and relate them to the work of Van de Poel (2011).

Asaro (2016) argues that the use of emerging technologies, including Autonomous Weapon Systems, under weak or absent norms can lead to limited or easily avoidable responsibility and accountability for states and individuals. Sparrow (2016), building on the work of Matthias (2004) and Roff (2013), states that the use of an Autonomous Weapon System risks a 'responsibility gap': it could be problematic to attribute responsibility to operators for actions taken by Autonomous Weapon Systems. Galliott (2015) also mentions the responsibility gap put forward by Sparrow and argues that shifting to forward-looking responsibility, instead of only backward-looking responsibility, and adopting a functional sense of responsibility that includes institutional agents and the human role in engineering the system, might be a way to avoid this gap. Crootof (2015) also discusses the accountability gap and notes that serious violations of international humanitarian law committed with Autonomous Weapon Systems may result in a lack of criminal liability (a form of backward-looking responsibility, but not accountability in the strict sense meant by Van de Poel (2011)) for the people involved, including the deployer, programmer, manufacturer and commander, or for the weapon system itself. According to Horowitz and Scharre (2015) the potential of an 'accountability gap' is the main motivation to implement the principle of Meaningful Human Control: if an Autonomous Weapon System malfunctions and strikes the wrong target, it is possible that no human is responsible for the error of the weapon.