Multimodal visual, auditory, thermal, and tactile feedback during Brain-Machine Interface use by a spinal cord injury patient

Open Access
Conference Proceedings
Authors: Carla Pais, Pedro Gaspar, Tânia Poleri, Demétrio Matos, Maria Azevedo, Miguel Gago, André Perrotta, Miguel Vieira

Abstract: Background: Brain-machine interfaces (BMIs) have the potential to replace and expand body functions, but also to induce neuroplasticity. In BMIs that include virtual reality and tactile feedback, the underlying mechanism is thought to be partially dependent on immersiveness (i.e., how "realistic" the environment is). It is not known, however, whether continuously increasing the number of stimulation modalities with the goal of creating a more immersive environment may eventually lead to an information overload and impair BMI performance.

Objective: The aim of this study was to investigate the performance of a spinal cord injury (SCI) patient in a BMI setup based on lower limb motor imagery (i.e., requiring the participant to modulate neural activity to control an avatar in complex virtual reality scenarios, while receiving coherent visual, auditory, tactile, and thermal feedback).

Setting: Experiments took place in the Physical Medicine and Rehabilitation Department at the Hospital Senhora da Oliveira in Guimarães, Portugal.

Methods: The participant was a 52-year-old male with a stabilized, ASIA-complete T4 SCI. The embodiment experiences were generated through a setup in which the subject was required to produce lower limb motor imagery commands while receiving multimodal feedback, delivered through a virtual reality headset (including goggles and headphones) combined with thermo-tactile feedback sleeves. Sessions comprised three phases: habituation (hand control of avatar movement), acquisition (neural data acquired to train the neural network), and real-time decoding (neural signals decoded in real time to control the avatar movements). Visual stimuli indicated whether the patient should think about "Walking" or "Not walking".

Results: In the sessions presented here, the participant consistently performed above chance level. In the small number of sessions where performance was closer to chance level, the patient reported increased stress due to variables not directly related to the experimental setup or design (work stress, personal problems, etc.). Unexpectedly, in a small number of sessions where the virtual scenario included water, the patient reported feeling his legs "cold".

Conclusions: This study demonstrates that a spinal cord injury patient can control a brain-machine interface combining virtual reality (visual and auditory), tactile, and thermal feedback, supporting the notion that the increased number of feedback modalities did not generate an information overload.

Keywords: Brain-Machine Interface, Spinal Cord Injury, Virtual Reality

DOI: 10.54941/ahfe100912

Cite this paper: