Toward Intuitive Interaction: A Cognitive Workflow Analysis of Human-Robot Interaction in Extended Reality Interfaces
Open Access Article · Conference Proceedings
Authors: Pattaraorn Yu, Arisara Jiamsanguanwong, Gim Song Soh
Abstract: Extended Reality (XR) technologies promise to enhance human–robot interaction (HRI) by offering intuitive spatial interfaces and immersive input. However, traditional evaluation methods, such as task completion time, error rates, or NASA-TLX, often obscure where cognitive and physical demands arise or are reduced within the interaction process. This study conducts a cognitive workflow analysis of XR interfaces by integrating established methodologies: Goal-Directed Task Analysis (GDTA), Norman’s Seven Stages of Action, and Applied Cognitive Task Analysis (ACTA). Together, these methods trace how interface design affects user cognition, from mission goals down to task-level interactions, revealing specific gulfs of execution and evaluation. We apply this approach to compare two XR interface types, a grid-based menu and spatial affordance-based pop-ups, in an emergency-response scenario on the Microsoft HoloLens 2. The analysis uncovers hidden cognitive challenges, such as inefficient visual search and occlusion issues, that conventional metrics often miss. The findings offer XR designers actionable insights into usability challenges and demonstrate how cognitive analysis can guide more intuitive interface development.
Keywords: Human-Robot Interaction (HRI), Multi-Robot System (MRS), Cognitive Task Analysis (CTA), Extended Reality (XR)
DOI: 10.54941/ahfe1006263