Comparison and Evaluation of Visual and Auditory Signals According to Different Levels of Information for Multimodal eHMI Design
Open Access Conference Proceedings Article
Authors: Dokshin Lim, Seong Geun Ahn
Abstract: Autonomous vehicles are evolving rapidly, but for the foreseeable future they will have to operate in mixed traffic and adapt to current road conditions. In this transition, the external human–machine interface (eHMI), proposed to take over the communicative role of the human driver, offers possibilities beyond simple signaling. In this study, we confirmed that an eHMI contributes to road safety and compared and evaluated combinations of visual and auditory signals. The experimental scenario was a potential pedestrian accident in which the driver's view was obstructed. It was produced as a 360-degree VR video so that participants could be immersed in the risky situation and the eHMI signals. Participants performed paired comparisons and rated the signals for intuitiveness and warning effectiveness; open discussion during the process was also recorded. Rather than delivering an excessive amount of information through both the auditory and visual channels, combinations in which the visual and auditory signals complemented each other performed better from the pedestrians' point of view.
Keywords: Autonomous Vehicle, Multi-modal, Sound, eHMI, 360-degree VR video
DOI: 10.54941/ahfe1002490