The Impact of Time Constraints on Moral Decision-Making during Human-AI Interaction

Open Access
Article
Conference Proceedings
Authors: Adriana Salatino, Arthur Prével, Emilie Caspar, Salvatore Lo Bue

Abstract: Humans increasingly collaborate with autonomous systems across a wide range of activities. Because this collaboration affects human decision-making and behaviour, it is essential to advance research on human–artificial intelligence (AI) interaction. AI systems are now employed to support decision-making even in sensitive areas such as medicine or defence and security, which can involve decisions with a moral dimension. A better understanding of the consequences of such interaction in these contexts is crucial to ensure that both decision efficiency and ethical considerations are effectively addressed. While AI can improve the quality and speed of decisions and reduce mental workload, it also carries risks such as complacency, loss of situational awareness, and skill decay, mainly because AI remains imperfect and errors may occur. These issues are more pronounced at higher levels of AI autonomy, which have been linked to reduced accuracy and responsibility in human decision-making, especially in moral contexts. Task difficulty, such as that induced by time pressure, may exacerbate over-reliance on AI; however, these aspects have not yet been sufficiently explored. The present study investigates whether task difficulty, induced by time pressure, influences moral decision-making in a military population interacting with AI systems. To this end, we conducted an experiment with an ad hoc task in which participants took on the role of drone operators and decided whether or not to launch an attack based on factors such as the presence of enemies and potential risks to allies, civilians, and infrastructure. Participants completed morally and non-morally challenging scenarios under low (15 seconds) and high (4 seconds) time pressure, with and without AI assistance. We hypothesised that increased time pressure would lead participants to rely more on AI advice, and that this over-reliance would in turn influence moral decision-making.

Keywords: Human-autonomous systems interaction, Human performance, Moral decision-making, Responsibility

DOI: 10.54941/ahfe1006629
