Information Fusion for Driver Distraction Studies Using Eye Tracking Glasses

Open Access
Conference Proceedings
Authors: Lucas Paletta (a), Michael Schwarz (a), Caroline Wollendorfer (b), Roland Perko (a)

Abstract: Eye tracking research on driver distraction in real-world driving tasks has so far demanded a massive amount of manual intervention to annotate hundreds of hours of head-camera video. We present a novel methodology that automatically integrates arbitrary gaze localizations onto a visual object and its local surroundings in order to draw heat maps directly onto the environment. Gaze locations are tracked across the video frames of the eye tracking glasses' head camera, within regions of the driver's environment, using an optical flow methodology. The high robustness and accuracy of the optical-flow-based tracking - a residual mean error of ca. 0.3 pixels on sequences captured and verified in 576 individual trials - enables a fully automated estimation of the driver's attention processes, for example in the context of roadside objects. We present results from a typical driver distraction study and visualize the fully aggregated human attention behavior.
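The core mechanism described in the abstract, tracking a gaze point from one head-camera frame to the next via optical flow, can be sketched as a single-level Lucas-Kanade least-squares solve. This is an illustrative sketch under assumed simplifications (grayscale frames, one pyramid level, a small fixed window); the function name `lk_track_point` and all parameters are hypothetical, not taken from the paper, which may well use a pyramidal or otherwise more robust variant to reach the reported sub-pixel accuracy.

```python
import numpy as np

def lk_track_point(prev, curr, pt, win=7):
    """Estimate where the point pt = (x, y) in grayscale frame `prev`
    has moved to in frame `curr`, using a single-level Lucas-Kanade
    least-squares solve over a win x win window.
    Illustrative sketch only, not the authors' implementation."""
    x, y = int(round(pt[0])), int(round(pt[1]))
    h = win // 2
    # Spatial image gradients of the previous frame (central differences).
    Ix = (np.roll(prev, -1, axis=1) - np.roll(prev, 1, axis=1)) / 2.0
    Iy = (np.roll(prev, -1, axis=0) - np.roll(prev, 1, axis=0)) / 2.0
    # Temporal gradient between the two frames.
    It = curr - prev
    # Brightness constancy linearized: Ix*dx + Iy*dy = -It in the window.
    sl = (slice(y - h, y + h + 1), slice(x - h, x + h + 1))
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    d, *_ = np.linalg.lstsq(A, b, rcond=None)
    # d = (dx, dy) is the estimated motion of the tracked pattern.
    return pt[0] + d[0], pt[1] + d[1]
```

Applied to every gaze sample of a trial, such a tracker lets the gaze coordinates follow a roadside object across frames, so fixations can be accumulated into an object-anchored heat map without frame-by-frame manual annotation.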

Keywords: Driver Attention Analysis, Optical Flow, Tracking, Geometric Transformation, Attention Mapping

DOI: 10.54941/ahfe100638
