
Alston (2010) describes these gaps as an 'accountability vacuum' in his UN report to the Human Rights Council on targeted killings. He defines targeted killings as '… the intentional, premeditated and deliberate use of lethal force, by States or their agents acting under colour of law, or by an organized armed group in armed conflict, against a specific individual who is not in the physical custody of the perpetrator.' Alston (2010, p. 26) notes that states failed to disclose: '…the procedural and other safeguards in place to ensure that killings are lawful and justified, and the accountability mechanisms that ensure wrongful killings are investigated, prosecuted and punished.' This accountability vacuum arises because the international community can neither verify the legality of a killing, nor confirm the authenticity of the intelligence used in the targeting process, nor ensure that an unlawful targeted killing does not result in impunity. Meloni (2016) argues that the accountability vacuum that Alston described in 2010 has been growing ever since.

Cummings (2006a) notes that an erosion of accountability could be caused by the use of computer decision-making systems, because these systems diminish the user's moral agency and responsibility through the perception that the automated system is in charge. This could lead operators to cognitively offload responsibility for a decision to a computer, which can be viewed as a lack of forward-looking (virtue) responsibility. This in turn creates a moral buffer, a form of distancing and compartmentalizing of decisions, leading to moral and ethical distance and an erosion of accountability.

As we have highlighted above, many authors use different notions when describing accountability gaps. They often refer to the notion of accountability while actually expressing blameworthiness, culpability or virtue responsibility, following the characterization of Van de Poel (2011). To gain a better understanding of accountability gaps, we aim to delineate them in more detail. We identify accountability gaps on three different levels, based on the layers described by Van den Berg (2015), who distinguishes an engineering, a socio-technical and a governance perspective to characterize cyberspace. As the offloading of decision-making responsibility by operators to Autonomous Weapon Systems may lead to an erosion of accountability, we identify three possible accountability gaps on these three levels:

1. Technical accountability gap: if the system is designed to be technically inaccessible, human operators cannot give a meaningful account of an action mediated by the machine, as information on the machine's decisions cannot be retrieved.

2. Socio-technical accountability gap: human operators do not have sufficient capacity (skill or knowledge) to interpret the behaviour of the machine, even though that behaviour would be accessible to, for example, an expert. This is linked to the capacity condition for blameworthiness described by Van de Poel (2011). The motivation to interpret the behaviour of a system may also be lacking if sufficient mechanisms for accountability are not available.
