Mixed Reality Handheld Displays for Robot Control: A Comparative Study
Open Access Article · Conference Proceedings
Authors: Vera Marie Memmesheimer, Ian Thomas Chuang, Bahram Ravani, Achim Ebert
Abstract: Robotic systems for applications ranging from healthcare to space exploration are being developed with varying levels of autonomy – from operating independently to working in collaboration with, or under the control of, human operators. Ensuring optimal human-robot cooperation requires appropriate UIs. In this context, Mixed Reality handheld displays (MR-HHDs), a ubiquitous tool for virtually augmenting reality, appear promising. Because existing MR-HHD UIs for robot control rely on fatigue-prone, view-obstructing touch input, we propose controlling a robot arm via an enhanced MR-HHD UI based on peripheral touch and device movement. Our detailed comparative user study on usability and cognitive load demonstrates that the proposed MR-HHD UI is a powerful tool for complementing the strengths of robots and humans. In our experiment, the MR-HHD UI outperformed a Gamepad and a Desktop UI in terms of temporal and cognitive demands and was rated as the preferred UI.
Keywords: Mixed Reality, Handheld Displays, Human-Robot Interaction
DOI: 10.54941/ahfe1005380