Evaluating Automated Gaze Mapping Across Laboratory and Field Study Settings
Open Access
Article
Conference Proceedings
Authors: Celina Vetter, Rebecca Nauli, Ruth Häusler Hermann, Maarten Uijt De Haag
Abstract: Eye Tracking (ET) is used in different industries to understand human visual attention, cognitive load, and decision-making processes by capturing where, and for how long, a person focuses their attention. A major challenge in processing ET data is mapping gaze points to dynamic Areas of Interest (AOIs) while accounting for data variability and head movements. The purpose of this research is to compare and evaluate methods for automatic gaze-to-AOI mapping to improve efficiency and accuracy in gaze analysis. An ArUco marker-based analysis software tool was developed to automate gaze mapping to AOIs using three methods, applied to data collected in laboratory and field settings. The methods are (1) marker-based mapping, and homography-based mapping using either (2) manually defined reference points or (3) feature detection. All three methods are compared against two baselines: manual gaze mapping and assisted mapping using commercial software. Overall, the results show that the performance of automatic mapping methods is highly dependent on the setting and AOI configuration. Under laboratory conditions, all automated gaze mapping methods achieved accuracy (97%) and F1-scores (97%) comparable to manual mapping. In complex field study settings, performance varied, and accuracy was reduced, ranging from 14% to 77% depending on the setting, due to sudden transitions in lighting and the real-time dynamics of the situation. The marker-based method demonstrated consistently high accuracy across all settings. Depending on the environment, the homography mapping based on manual reference points occasionally demonstrated superior accuracy. Manual mapping remained the most accurate under field study conditions but required significantly more processing time. Future work will focus on enhancing the method's robustness in dynamic environments through adaptive reference image selection.
This will increase the accuracy of gaze-to-AOI mapping and set the stage for real-time monitoring of visual attention in complex, safety-critical contexts. This advancement will foster the development of resilient, adaptive human-machine systems that dynamically respond to operator conditions, significantly reducing the likelihood of human error and enhancing overall performance.
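The paper's software is not described in implementation detail here, but the homography-based idea in the abstract can be sketched: four reference points (e.g. ArUco marker corners) seen in the scene-camera frame are matched to their known positions on a reference image of the stimulus, a homography is estimated, and each gaze point is projected into the reference frame and tested against rectangular AOIs. The function names, point values, and AOI layout below are illustrative assumptions, not the authors' code.

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate a 3x3 homography H (dst ~ H @ src) from >= 4 point
    correspondences via the Direct Linear Transform (DLT)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return Vt[-1].reshape(3, 3)

def map_gaze(H, gaze):
    """Project a gaze point from scene-camera coordinates into the
    reference-image frame (homogeneous coordinates, then dehomogenize)."""
    p = H @ np.array([gaze[0], gaze[1], 1.0])
    return p[0] / p[2], p[1] / p[2]

def hit_aoi(point, aois):
    """Return the name of the first rectangular AOI (x0, y0, x1, y1)
    containing the mapped point, or None if it misses all AOIs."""
    px, py = point
    for name, (x0, y0, x1, y1) in aois.items():
        if x0 <= px <= x1 and y0 <= py <= y1:
            return name
    return None

# Hypothetical example: marker corners seen by the scene camera (shifted and
# scaled by head position) matched to their positions in the reference image.
scene_corners = [(100, 100), (500, 100), (500, 400), (100, 400)]
ref_corners = [(0, 0), (800, 0), (800, 600), (0, 600)]
H = estimate_homography(scene_corners, ref_corners)
mapped = map_gaze(H, (300, 250))          # gaze point in scene-camera pixels
aois = {"display": (350, 250, 450, 350), "panel": (0, 0, 100, 100)}
label = hit_aoi(mapped, aois)
```

In practice the correspondences would come from per-frame ArUco detection (e.g. OpenCV's `aruco` module) or feature matching rather than fixed values, which is where the lighting and motion sensitivity reported in the field settings enters.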
Keywords: Eye Tracking Data Processing, Gaze Point Mapping, Dynamic Areas of Interest, Automatic Analysis Methods
DOI: 10.54941/ahfe1006722