Cognitive Workload and Interface Performance: A Neuroergonomic Comparison of VR, AR, and Traditional Drone Control Systems
Open Access · Article · Conference Proceedings
Author: Suvipra Singh
Abstract: As small Unmanned Aerial Systems (sUAS) become vital tools in sectors such as disaster response, inspection, and precision operations, understanding how interface modality shapes pilot cognition is critical. This study compares Virtual Reality (VR), Augmented Reality (AR), and Traditional (physical controller) interfaces under simulated conditions to isolate neurocognitive differences among novice, intermediate, and expert drone pilots. Real-time electroencephalography (EEG) recorded theta, alpha, and beta wave activity as participants completed standardized flight tasks including spatial navigation, obstacle avoidance, altitude stabilization, and precision landing. EEG metrics captured continuous variations in cognitive workload, attentional engagement, and sensorimotor regulation across skill levels. Results indicate that VR induced elevated beta activity linked to sensory integration demands, AR maintained balanced alpha–theta dynamics reflecting optimal engagement, and Traditional control minimized workload through procedural fluency. These findings contribute neuroergonomic insights for developing skill-adaptive, cognitively optimized sUAS interfaces that enhance performance, learning, and operator well-being.
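The EEG workload metrics described in the abstract are typically derived from band-limited spectral power. The sketch below is illustrative only and is not the authors' pipeline: it estimates theta, alpha, and beta power for a single channel with Welch's method and combines them into a theta/alpha ratio as a simple workload proxy. The 256 Hz sampling rate, band boundaries, and all function names are assumptions chosen for demonstration.

```python
# Illustrative sketch (not the paper's actual analysis): single-channel EEG
# band power and a theta/alpha workload proxy. Sampling rate and band edges
# are assumed values, not taken from the study.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate (Hz)
BANDS = {"theta": (4.0, 8.0), "alpha": (8.0, 13.0), "beta": (13.0, 30.0)}

def band_powers(signal, fs=FS, bands=BANDS):
    """Absolute power per band, integrated from a Welch PSD estimate."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)  # 2-second windows
    powers = {}
    for name, (lo, hi) in bands.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = np.trapz(psd[mask], freqs[mask])  # integrate PSD over band
    return powers

def workload_index(powers):
    """Theta/alpha ratio, a commonly used proxy for cognitive workload."""
    return powers["theta"] / powers["alpha"]

if __name__ == "__main__":
    # Synthetic 60 s segment mixing theta, alpha, and beta components with noise.
    rng = np.random.default_rng(0)
    t = np.arange(0, 60, 1 / FS)
    eeg = (10 * np.sin(2 * np.pi * 10 * t)   # alpha component
           + 4 * np.sin(2 * np.pi * 6 * t)   # theta component
           + 2 * np.sin(2 * np.pi * 20 * t)  # beta component
           + rng.normal(0, 1, t.size))       # broadband noise
    p = band_powers(eeg)
    print({k: round(v, 2) for k, v in p.items()},
          "workload:", round(workload_index(p), 3))
```

In practice such indices would be computed per task segment and per participant, then compared across the VR, AR, and Traditional interface conditions; that comparison step is outside the scope of this sketch.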
Keywords: Neuroergonomics, Human–Computer Interaction, Virtual and Augmented Reality, Drone Control Systems, Cognitive Workload, Interface Modality
DOI: 10.54941/ahfe1006902