Multi-camera Face Tracking for Estimating User’s Discomfort with Automated Vehicle Operations

Open Access Article, Conference Proceedings
Authors: Matthias Beggiato, Nadine Rauh, Josef Krems

Abstract: Face tracking, as an innovative and unobtrusive sensor technology, offers new possibilities for driver state monitoring with regard to discomfort in automated driving. To explore the potential of automated facial expression analysis, video data from two driving simulator studies were analyzed using the Visage facial features and analysis software. A gender-balanced sample of 81 participants aged between 24 and 84 years took part in the studies. All participants were driven in highly automated mode on the same standardized track, consisting of three close-approach situations to a truck driving ahead. By pressing the lever of a handset control, participants could report perceived discomfort continuously. Tracking values for 23 facial action units were extracted from multiple video camera streams, z-transformed, and averaged from 10 s before pressing the handset control until 10 s after to show changes over time. Results showed situation-related pressing and stretching of the lips, a push-back movement of the head, raising of the inner brows and upper lids, as well as reduced eye closure. These patterns can be interpreted as visual attention, tension, and surprise. Overall, facial expression analysis has potential to contribute information about users' comfort with automated vehicle operations. However, the effects became manifest only at the aggregated data level; obtaining stable and reliable results at the individual level remains a challenging task.
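As a rough illustration of the event-aligned analysis described in the abstract, the following Python sketch shows how a single action unit (AU) time series could be z-transformed within a participant and then averaged over a ±10 s window around handset-press events. This is not the authors' actual pipeline; the function name, sampling rate, and input format are assumptions for illustration only.

```python
import numpy as np

def event_aligned_au_profile(au_signal, event_indices, fs=30, window_s=10):
    """Hypothetical sketch: z-transform one AU time series and average it
    around discomfort-report events (handset-lever presses).

    au_signal     : 1-D array of raw AU activation values over time
    event_indices : sample indices at which the handset lever was pressed
    fs            : assumed sampling rate of the face-tracking stream in Hz
    window_s      : half-window in seconds before/after each event
    """
    # Z-transform within the participant so AU values are comparable across people
    z = (au_signal - np.mean(au_signal)) / np.std(au_signal)

    half = int(window_s * fs)
    segments = []
    for idx in event_indices:
        # Keep only events with a complete -10 s to +10 s window
        if idx - half >= 0 and idx + half < len(z):
            segments.append(z[idx - half: idx + half + 1])

    # Average across events: one mean time course from -10 s to +10 s
    return np.mean(segments, axis=0) if segments else None
```

Averaging such profiles across participants would then yield the aggregated time courses on which, according to the abstract, the reported facial-expression effects became visible.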

Keywords: Automated Driving, Face Tracking, Facial Action Units, Facial Expressions, Comfort, Driver-Vehicle Team, Driver State Monitoring

DOI: 10.54941/ahfe1001104

