Paper Type

Research-in-Progress Paper

Description

Remotely piloted aircraft (RPAs or "drones") have become important tools in military surveillance and combat, border protection, policing and disaster management. In particular, the use of weaponized RPAs has led to a discussion of the ethical, strategic and legal implications of using such systems in warfare. In this context, studies suggest that RPA pilots experience levels of post-traumatic stress, depression and anxiety disorders similar to those of fighter pilots, although the flight and combat experiences are completely different. To investigate this phenomenon, we created an experiment that intends to measure the "moral stress" RPA pilots may experience when the operation of such systems leads to human casualties. "Moral stress" refers to the possibility that deciding upon moral dilemmas may not only cause physiological stress, but may also lead to (unconscious) changes in the evaluation of values and reasons that are relevant to problem solving. The experiment includes an RPA simulation based on a game engine and novel measurement tools to assess moral reasoning. In this contribution, we outline the design of the experiment and the results of pretests that demonstrate the sensitivity of our measures. We close by arguing for the need for such studies to better understand novel forms of human-computer interaction.


MEASURING THE MORAL IMPACT OF OPERATING "DRONES" ON PILOTS IN COMBAT, DISASTER MANAGEMENT AND SURVEILLANCE