Multimodal HCI: a review of computational tools and their relevance to the detection of sexual presence

Open Access
Article
Conference Proceedings
Authors: Clement Galaup, Lama Séoud, Patrice Renaud

Abstract: Cybersexuality, referring to sexual interactions facilitated by or involving sexual technologies, is poised, for better or worse, to play an increasingly significant role in people's lives. The psychophysiological states stemming from such interactions with sexual technologies, and especially from virtual reality (VR) scenarios, are termed "sexual presence" (SP). Automatically assessing this state may help detect problematic sexual responses, particularly for forensic purposes. This work aims to review the different methods used to analyse and algorithmically evaluate multimodal, electroencephalography (EEG)-centric physiological signals through a multimodal human-computer interface (HCI), and to pinpoint those that prove relevant to the detection of SP.

Multimodal HCIs are defined as the processing of combined natural modalities with a multimedia system or environment. Each modality engages different human capabilities (cognitive, sensory, motor, perceptual). These capabilities, in response to the multimedia environment, can be quantified through psychophysiological signals such as EEG, electrocardiography (ECG), skin conductance, skin temperature, respiration rate, eye gaze, and head movements, to name only the most common.

While existing surveys have focused on the specific use of EEG to analyse emotions, or on the measurement techniques and methods used to record psychophysiological signals, this work reviews the computational tools, mostly based on machine and deep learning, used to process, analyse, and combine various physiological signals in HCI.

Papers published in the last 10 years that combine at least two psychophysiological signals in an HCI system were collected and reviewed, regardless of the field of application. The focus was mostly on methodological aspects such as signal synchronization and calibration, fusion approach, model architecture, and learning strategy. We put an emphasis on methods that can be used to detect a subject's condition in real time. In light of this review, we identify a research gap in terms of computational tools for multimodal data classification and prediction.

This review will allow us to draw on existing work in other fields of application to address our specific application: analysing EEG, oculometry, and sexual plethysmography (penile for men, vaginal for women) signals together, using deep learning, to detect SP in subjects immersed in a VR environment with sexual content.
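To make the fusion idea concrete, here is a minimal sketch of feature-level (early) fusion of heterogeneous physiological signals, one of the approaches surveyed in the review. The modality names, feature counts, and synthetic data below are illustrative assumptions, not taken from the paper; each modality's features are standardized before concatenation so that no single signal dominates the fused representation.

```python
import numpy as np

def zscore(x):
    # Standardize each feature column; the epsilon guards against
    # zero-variance features.
    return (x - x.mean(axis=0)) / (x.std(axis=0) + 1e-8)

def feature_level_fusion(modalities):
    # Early (feature-level) fusion: standardize each modality's
    # feature matrix, then concatenate along the feature axis.
    return np.concatenate([zscore(m) for m in modalities], axis=1)

# Toy example: 4 synchronized epochs with synthetic features for
# EEG, eye gaze, and plethysmography (all hypothetical).
rng = np.random.default_rng(0)
eeg = rng.normal(size=(4, 8))    # e.g. band power in 8 channels
gaze = rng.normal(size=(4, 2))   # e.g. fixation duration, pupil size
pleth = rng.normal(size=(4, 1))  # e.g. tumescence amplitude
fused = feature_level_fusion([eeg, gaze, pleth])
print(fused.shape)  # (4, 11)
```

The fused matrix would then feed a single classifier; by contrast, late (decision-level) fusion would train one model per modality and combine their output scores.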

Keywords: Machine Learning, Multimodal Signal Analysis, VR, Brain-Computer Interface, Sexual Presence

DOI: 10.54941/ahfe1004477
