Dora: an AR-HUD interactive system that combines gesture recognition and eye-tracking
Open Access Conference Proceedings Article
Authors: Xingguo Zhang, Zinan Chen, Xinyu Zhu, Zhenyu Gu
Abstract: AR-HUD is a popular driving assistance system that displays image information on the front windshield to ensure driving safety and enhance the driving experience. In future highly automated driving (HAD) scenarios, AR-HUD will handle more complex non-driving tasks and provide more diverse information to drivers and passengers. To meet these changing needs with new technologies, this paper proposes an AR-HUD system controlled interactively by gesture and gaze. Following a user experience design method, the interaction design was grounded in research on driving behaviors and user demands in HAD scenarios. A prototype for evaluation was built on OpenCV, and a virtual driving scene was constructed around it. A user study (n = 20) verified the usability of the system.
Keywords: augmented reality, head-up display, gesture recognition, eye-tracking
DOI: 10.54941/ahfe1002086
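The abstract notes only that the evaluation prototype was built on OpenCV; the paper's actual pipeline is not described here. As a hedged illustration of what an OpenCV-based gesture front end for such a system could look like, the sketch below segments a hand by skin color and counts convexity defects to estimate raised fingers, a count that a system like Dora might map to AR-HUD commands. All function and threshold choices are assumptions, not the authors' implementation.

```python
# Minimal sketch (assumption, not the paper's method) of an OpenCV
# hand-gesture detector: skin-color segmentation in HSV space, then
# convexity-defect counting on the largest contour to estimate how
# many fingers are raised.
import cv2
import numpy as np


def count_raised_fingers(frame_bgr):
    """Return a rough estimate of raised fingers in a BGR camera frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Skin-color range is a rough assumption; it would need in-cabin tuning.
    mask = cv2.inRange(hsv, (0, 30, 60), (20, 150, 255))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)
    if cv2.contourArea(hand) < 3000:  # ignore small blobs that are not a hand
        return 0

    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0

    fingers = 0
    for start_idx, end_idx, far_idx, depth in defects[:, 0]:
        # Deep defects correspond to gaps between raised fingers
        # (depth is fixed-point, i.e. pixels * 256).
        if depth > 10000:
            fingers += 1
    return fingers + 1 if fingers else 0


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    if ok:
        print("estimated raised fingers:", count_raised_fingers(frame))
    cap.release()
```

In practice, a per-frame finger count like this would be smoothed over time and combined with the gaze signal before being interpreted as an AR-HUD command; that fusion step is outside the scope of this sketch.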