Construction of a VR Multimodal Dataset for Stress Recognition
Open Access
Article
Conference Proceedings
Authors: Qichao Zhao, Jianming Yang, Qingju Wang, Ying Gao, Bing Zhang, Qian Zhou, Ping Wu, Han Li
Abstract: Accurate identification of individuals' stress states is critical for optimizing intervention strategies and enhancing safety performance in intelligent human-machine interaction systems and high-risk operational environments. Virtual Reality (VR) technology offers a novel paradigm for inducing controllable yet ecologically valid stress through highly immersive scenarios. Developing high-quality multimodal stress datasets is an urgent requirement for advancing affective computing and practical applications of intelligent human-computer interaction. This study presents the construction of a VR-based multimodal stress dataset. The experimental protocol comprised four tasks: ground walking, walking on elevated platform 1, walking on elevated platform 2, and a jumping platform task. Physiological data including electroencephalogram (EEG), photoplethysmography (PPG), electrodermal activity (EDA), and eye-tracking data were collected across all tasks, along with subjective stress ratings for each scenario. Data from 30 participants were acquired. A binary classification was performed between two representative scenarios: ground walking (low-stress state) and jumping platform (high-stress state). Following feature extraction from the EEG and PPG signals, classification models (decision tree, random forest, Bagging, and AdaBoost) were implemented. The random forest classifier achieved the best performance, yielding a cross-subject five-fold cross-validation accuracy of 0.8360 ± 0.0162 and an F1 score of 0.8140 ± 0.0301 for distinguishing low-stress from high-stress states. This dataset provides essential data support for real-time stress recognition, with potential applications in intelligent human-computer interaction and medical rehabilitation. Data-driven interventions based on this resource could significantly enhance health outcomes and work efficiency across multiple domains.
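The evaluation described in the abstract — cross-subject five-fold cross-validation of a random forest on features extracted from EEG and PPG — can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the feature matrix is synthetic, and the feature dimensionality, window counts, and model hyperparameters are assumptions. The key point is the use of participant-grouped folds so that no participant's data appears in both training and test sets.

```python
# Hedged sketch of cross-subject five-fold cross-validation for binary
# stress classification (ground walking = 0 vs. jumping platform = 1).
# Features are synthetic stand-ins for extracted EEG/PPG features
# (e.g. band powers, heart-rate variability); all sizes are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupKFold, cross_validate

rng = np.random.default_rng(0)
n_participants, windows_per_task, n_features = 30, 20, 12

X_parts, y, groups = [], [], []
for pid in range(n_participants):
    for label in (0, 1):
        # Shifted class mean gives the classifier something to learn.
        feats = rng.normal(loc=label * 0.8, scale=1.0,
                           size=(windows_per_task, n_features))
        X_parts.append(feats)
        y.extend([label] * windows_per_task)
        groups.extend([pid] * windows_per_task)

X = np.vstack(X_parts)
y = np.asarray(y)
groups = np.asarray(groups)

# GroupKFold keeps every participant's windows within a single fold,
# making the evaluation cross-subject rather than within-subject.
cv = GroupKFold(n_splits=5)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_validate(clf, X, y, groups=groups, cv=cv,
                        scoring=("accuracy", "f1"))

print(f"accuracy: {scores['test_accuracy'].mean():.3f} "
      f"± {scores['test_accuracy'].std():.3f}")
print(f"F1:       {scores['test_f1'].mean():.3f} "
      f"± {scores['test_f1'].std():.3f}")
```

Grouped splitting matters here: a plain `KFold` would mix windows from the same participant across train and test folds, inflating accuracy through subject-specific signal characteristics rather than genuine stress-related patterns.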
Keywords: Virtual Reality (VR), Stress Recognition, Multimodal Dataset, Physiological Signals, Machine Learning
DOI: 10.54941/ahfe1006351