Dissertation

4 VALUE DELIBERATION

At the start of part 1 of the survey, before the first ranking, the participants were asked to list an advantage and a disadvantage of each alternative (step 2 in Figure 14). Early warning, safety of soldiers, and quick response to threat were mentioned most often as advantages. The disadvantages mentioned were: late response to threat, automation bias, false positives in identification, and dehumanisation of the target.

During the value deliberation (step 5 in Figure 14), experts from different backgrounds discussed the context of the scenario and the alternatives. Their experience and background determined how they viewed the scenario and alternatives, and influenced their answers and rankings. For example, a scientist viewed the values as part of the design process, while for a policymaker it was important that the system provides proper information and that the commander can review this information. One of the participants felt very uncomfortable with the image recognition and raised privacy issues in this ’big brother’ scenario. An expert in computer vision viewed the 99% confidence as too uncertain, not reliable enough, and not an improvement of the system because it makes the system more difficult to understand, whereas military personnel (non-experts in computer vision) viewed the addition of 99% confidence as an increase in the reliability of the system. Military personnel also viewed the scenario through the principles of the Rules of Engagement and hostile intent, which gave them context on which to base their answers. This shows that differences in experience and background, such as technical expertise or operational experience, influence the answers and rankings of the participants in the value discussion. Because this can affect design choices based on value elicitation, the variety of participants’ backgrounds and levels of expert knowledge should be taken into account when conducting the value deliberation and making design choices.
Another value discussed among the participants at the evaluation (step 8 in Figure 14) was trust in the system. One participant stated that an AI system can make decisions with fewer errors than human decision-making (for example with Autonomous Vehicles). The option in which the Autonomous Weapon System was used only as an early warning system was the most acceptable and most trusted. Paraphrasing one of the military participants: ‘It is about understanding the strategy and context of the mission. We need to understand the impact of technology and our presence on the mission. We should think more carefully about which technology to apply in which context.’ This shows that not all applications of an Autonomous Weapon System in a mission context inspire trust among military experts in the decision-making of the Autonomous Weapon System. In some cases, human decision-making is more trusted and preferred. In general, the context in which an Autonomous Weapon System is deployed affects the meaning and weight people attribute to the values associated with it.
