Measuring Human Influential Factors During VR Gaming at Home: Towards Optimized Per-User Gaming Experiences

Open Access
Article
Conference Proceedings
Authors: Marc Antoine Moinnereau, Tiago Henrique Falk, Alcyr Alves De Oliveira

Abstract: It is known that human influential factors (HIFs, e.g., sense of presence/immersion; attention, stress, and engagement levels; fun factors) play a crucial role in the gamer’s perceived immersive media experience [1]. To this end, recent research has explored the use of affective brain-/body-computer interfaces to monitor such factors [2, 3]. Typically, studies have been conducted in laboratory settings and have relied on research-grade neurophysiological sensors. Transferring the obtained knowledge to everyday settings, however, is not straightforward, especially since it requires cumbersome and lengthy preparation (e.g., placing electroencephalography caps, applying gel, and testing impedances) that can be overwhelming for gamers. To overcome this limitation, we have recently developed an instrumented “plug-and-play” virtual reality head-mounted display (termed iHMD) [4] which directly embeds a number of dry ExG sensors (electroencephalography, EEG; electrocardiography, ECG; electromyography, EMG; and electrooculography, EOG) into the HMD. A portable bioamplifier is used to collect, stream, and/or store the biosignals in real time. Moreover, a software suite has been developed to automatically measure signal quality [5], enhance the biosignals [6, 7, 8], infer breathing rate from the ECG [9], and extract relevant HIFs from the post-processed signals [3, 10, 11]. More recently, we have also developed companion software to allow for use and monitoring of the device at the gamer’s home with minimal experimental supervision, hence exploring its potential use truly “in the wild”. The iHMD, VR controllers, and a laptop, along with a copy of the Half-Life: Alyx videogame, were dropped off at the homes of 10 gamers who consented to participate in the study. All public health COVID-19 protocols were followed, including sanitizing the iHMD in a UV-C light chamber and with sanitizing wipes 48 h prior to dropping the equipment off.
Instructions on how to set up the equipment and the game, as well as a Google Form with a multi-part questionnaire [12] to be answered after gameplay, were provided via videoconference. The researcher remained available remotely in case participants had questions, but interventions were otherwise minimal. Participants were asked to play the game for around one hour, and none of the participants reported cybersickness. This paper details the results obtained from this study and shows the potential of measuring HIFs from ExG signals collected “in the wild,” as well as their use in remote gaming-experience monitoring. In particular, we show the potential of measuring gamer engagement and sense of presence from the collected signals, and their influence on the overall experience. The next steps will be to use these signals and inferred HIFs to adjust the game in real time, thus maximizing the experience for each individual gamer.

References
[1] Perkis, A., et al., 2020. QUALINET white paper on definitions of immersive media experience (IMEx). arXiv preprint arXiv:2007.07032.
[2] Gupta, R., et al., 2016. Using affective BCIs to characterize human influential factors for speech QoE perception modelling. Human-centric Computing and Information Sciences, 6(1):1-19.
[3] Clerico, A., et al., 2016. Biometrics and classifier fusion to predict the fun-factor in video gaming. In IEEE Conf. Comp. Intell. and Games, pp. 1-8.
[4] Cassani, R., et al., 2020. Neural interface instrumented virtual reality headsets: Toward next-generation immersive applications. IEEE SMC Mag., 6(3):20-28.
[5] Tobón, D., et al., 2014. MS-QI: A modulation spectrum-based ECG quality index for telehealth applications. IEEE TBE, 63(8):1613-1622.
[6] Tobón, D. and Falk, T.H., 2016. Adaptive spectro-temporal filtering for electrocardiogram signal enhancement. IEEE JBHI, 22(2):421-428.
[7] dos Santos, E., et al., 2020. Improved motor imagery BCI performance via adaptive modulation filtering and two-stage classification. Biomed. Signal Proc. Control, Vol. 57.
[8] Rosanne, O., et al., 2021. Adaptive filtering for improved EEG-based mental workload assessment of ambulant users. Front. Neurosci., Vol. 15.
[9] Cassani, R., et al., 2018. Respiration rate estimation from noisy electrocardiograms based on modulation spectral analysis. CMBES Proc., Vol. 41.
[10] Tiwari, A. and Falk, T.H., 2021. New measures of heart rate variability based on subband tachogram complexity and spectral characteristics for improved stress and anxiety monitoring in highly ecological settings. Front. Signal Proc., Vol. 7.
[11] Moinnereau, M.A., 2020. Saccadic eye movement classification using ExG sensors embedded into a virtual reality headset. In IEEE Conf. SMC, pp. 3494-3498.
[12] Tcha-Tokey, K., et al., 2016. Proposition and validation of a questionnaire to measure the user experience in immersive virtual environments. Intl. J. Virtual Reality, 16:33-48.
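To give a flavor of the ECG-to-breathing-rate step mentioned in the abstract: the authors' method is based on modulation spectral analysis [9], which is not reproduced here. The sketch below instead illustrates the general idea of ECG-derived respiration (EDR) on synthetic data: respiration modulates R-peak amplitudes, so tracking those amplitudes and finding the dominant frequency in the respiratory band recovers the breathing rate. All signal parameters (sampling rate, beat rate, modulation depth) are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 250  # ECG sampling rate in Hz (illustrative)
t = np.arange(0, 60, 1 / fs)  # 60 s of synthetic signal

# Synthetic "ECG": a 1 Hz train of narrow Gaussian R-peaks whose
# amplitude is modulated by a 0.25 Hz (15 breaths/min) respiration.
beat_times = np.arange(0.5, 60, 1.0)
ecg = np.zeros_like(t)
for bt in beat_times:
    amp = 1.0 + 0.3 * np.sin(2 * np.pi * 0.25 * bt)
    ecg += amp * np.exp(-((t - bt) ** 2) / (2 * 0.01 ** 2))

# 1) Locate R-peaks (at least 0.4 s apart, above half the base amplitude).
peaks, _ = find_peaks(ecg, height=0.5, distance=int(0.4 * fs))

# 2) Resample the R-peak amplitude series onto a uniform 4 Hz grid,
#    since beats arrive at irregular (heart-rate-dependent) times.
fs_r = 4.0
t_uniform = np.arange(t[peaks][0], t[peaks][-1], 1 / fs_r)
edr = np.interp(t_uniform, t[peaks], ecg[peaks])
edr -= edr.mean()  # remove DC before spectral analysis

# 3) Dominant spectral peak in the respiratory band (0.1-0.5 Hz).
freqs = np.fft.rfftfreq(len(edr), 1 / fs_r)
spectrum = np.abs(np.fft.rfft(edr))
band = (freqs >= 0.1) & (freqs <= 0.5)
breathing_hz = freqs[band][np.argmax(spectrum[band])]
bpm = breathing_hz * 60  # breaths per minute
```

On this synthetic signal the recovered rate lands near the 15 breaths/min used to generate it; real ECG would additionally require the quality-index and enhancement steps the abstract describes [5, 6].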

Keywords: Human Influential Factors, User experience, Virtual reality, Remote experiment

DOI: 10.54941/ahfe1002056
