Embodiment Is Related to Better Performance on a Brain–Computer Interface in Immersive Virtual Reality: A Pilot Study

Sensors ◽  
2020 ◽  
Vol 20 (4) ◽  
pp. 1204 ◽  
Author(s):  
Julia M. Juliano ◽  
Ryan P. Spicer ◽  
Athanasios Vourvopoulos ◽  
Stephanie Lefebvre ◽  
Kay Jann ◽  
...  

Electroencephalography (EEG)-based brain–computer interfaces (BCIs) for motor rehabilitation aim to “close the loop” between attempted motor commands and sensory feedback by providing supplemental information when individuals successfully achieve specific brain patterns. Existing EEG-based BCIs use various displays to provide feedback, ranging from displays considered more immersive (e.g., head-mounted display virtual reality (HMD-VR)) to displays considered less immersive (e.g., computer screens). However, it is not clear whether more immersive displays improve neurofeedback performance and whether there are individual performance differences in HMD-VR versus screen-based neurofeedback. In this pilot study, we compared neurofeedback performance in HMD-VR versus a computer screen in 12 healthy individuals and examined whether individual differences on two measures (i.e., presence, embodiment) were related to neurofeedback performance in either environment. We found that, while participants’ performance on the BCI was similar between display conditions, the participants’ reported levels of embodiment were significantly different. Specifically, participants experienced higher levels of embodiment in HMD-VR compared to a computer screen. We further found that reported levels of embodiment positively correlated with neurofeedback performance only in HMD-VR. Overall, these preliminary results suggest that embodiment may relate to better performance on EEG-based BCIs and that HMD-VR may increase embodiment compared to computer screens.

2019 ◽  
Author(s):  
Julia M Juliano ◽  
Ryan P Spicer ◽  
Athanasios Vourvopoulos ◽  
Stephanie Lefebvre ◽  
Kay Jann ◽  
...  

Brain–computer interfaces (BCIs) can be used to provide individuals with neurofeedback of their own brain activity and train them to control it. Neurofeedback-based BCIs used for motor rehabilitation aim to ‘close the loop’ between attempted motor commands and sensory feedback by providing supplemental sensory information when individuals successfully establish specific brain patterns. Existing neurofeedback-based BCIs have used a variety of displays to provide feedback, ranging from devices that provide a more immersive and compelling experience (e.g., head-mounted virtual reality (HMD-VR) or CAVE systems) to devices that are considered less immersive (e.g., computer screens). However, it is not clear whether more immersive systems (i.e., HMD-VR) improve neurofeedback performance compared to computer screens, or whether there are individual performance differences in HMD-VR versus screen-based neurofeedback. In this pilot experiment, we compared neurofeedback performance in HMD-VR versus on a computer screen in twelve healthy individuals. We also examined whether individual differences in presence or embodiment correlated with neurofeedback performance in either environment. Participants were asked to control a virtual right arm by imagining right hand movements. Real-time brain activity indicating motor imagery, measured via electroencephalography (EEG) as desynchronized sensorimotor rhythms (SMR; 8–24 Hz) in the left motor cortex, drove the movement of the virtual arm towards (increased SMR desynchronization) or away from (decreased SMR desynchronization) targets. Participants performed two blocks of 30 trials, one for each condition (Screen, HMD-VR), with the order of conditions counterbalanced across participants. After completing each block, participants were asked questions relating to their sense of presence and embodiment in each environment.
We found that, while participants’ performance on the neurofeedback-based BCI task was similar between conditions, their reported levels of embodiment were significantly different. Specifically, participants experienced higher levels of embodiment in HMD-VR compared to the computer screen. We further found that reported levels of embodiment positively correlated with neurofeedback performance only in the HMD-VR condition. Overall, these preliminary results suggest that embodiment may improve performance on a neurofeedback-based BCI and that HMD-VR may increase embodiment during a neurofeedback-based BCI task compared to a standard computer screen.
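The feedback mechanism described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' code: the periodogram-based band-power estimate, the baseline comparison, and the mapping from desynchronization to arm velocity are all assumptions based only on the abstract's description (SMR, 8–24 Hz, left motor cortex).

```python
import numpy as np

def bandpower(signal, fs, band=(8.0, 24.0)):
    """Mean periodogram power of `signal` within `band` (Hz)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def erd_percent(trial, baseline, fs, band=(8.0, 24.0)):
    """Event-related desynchronization as percent power drop vs. baseline.
    Positive values = desynchronization (power decrease during imagery)."""
    p_trial = bandpower(trial, fs, band)
    p_base = bandpower(baseline, fs, band)
    return 100.0 * (p_base - p_trial) / p_base

# Synthetic demo: a 12 Hz mu rhythm that attenuates during motor imagery.
fs = 250
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)
baseline = np.sin(2 * np.pi * 12 * t) + 0.1 * rng.standard_normal(t.size)
imagery = 0.4 * np.sin(2 * np.pi * 12 * t) + 0.1 * rng.standard_normal(t.size)

erd = erd_percent(imagery, baseline, fs)
# Hypothetical feedback mapping: stronger desynchronization drives the
# virtual arm toward the target; weaker or negative ERD drives it away.
velocity = np.clip(erd / 100.0, -1.0, 1.0)
```

In a real online BCI this computation would run on short sliding windows of a motor-cortex channel (e.g., C3) rather than whole trials, but the baseline-relative band-power logic is the same.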


2013 ◽  
Vol 4 (1) ◽  
pp. 1 ◽  
Author(s):  
Alessandro Luiz Stamatto Ferreira ◽  
Leonardo Cunha de Miranda ◽  
Erica Esteves Cunha de Miranda ◽  
Sarah Gomes Sakamoto

Brain–computer interfaces (BCIs) enable users to interact with a computer through their brain's biological signals alone, without the need to use muscles. BCI is an emerging research area but is still relatively immature. Nevertheless, it is important to reflect on the different aspects of the Human–Computer Interaction (HCI) area related to BCIs, considering that BCIs will be part of interactive systems in the near future. BCIs must attend not only to users with disabilities but also to healthy ones, improving interaction for all end-users. Virtual Reality (VR) is also an important part of interactive systems and, combined with BCI, could greatly enhance user interactions, improving the user experience by using brain signals as input with immersive environments as output. This paper addresses only noninvasive BCIs, since this form of signal acquisition is the only one that poses no risk to human health. As contributions of this work, we highlight a survey of interactive systems based on BCIs focusing on HCI and VR applications, and a discussion of the challenges and future of this subject matter.


2018 ◽  
Vol 9 ◽  
Author(s):  
Ana L. Faria ◽  
Mónica S. Cameirão ◽  
Joana F. Couras ◽  
Joana R. O. Aguiar ◽  
Gabriel M. Costa ◽  
...  

Sensors ◽  
2020 ◽  
Vol 20 (13) ◽  
pp. 3754 ◽  
Author(s):  
Octavio Marin-Pardo ◽  
Christopher M. Laine ◽  
Miranda Rennie ◽  
Kaori L. Ito ◽  
James Finley ◽  
...  

Severe impairment of limb movement after stroke can be challenging to address in the chronic stage of stroke (e.g., greater than 6 months post stroke). Recent evidence suggests that physical therapy can still promote meaningful recovery after this stage, but the required high amount of therapy is difficult to deliver within the scope of standard clinical practice. Digital gaming technologies are now being combined with brain–computer interfaces to motivate engaging and frequent exercise and promote neural recovery. However, the complexity and expense of acquiring brain signals have held back widespread utilization of these rehabilitation systems. Furthermore, for people who have residual muscle activity, electromyography (EMG) might be a simpler and equally effective alternative. In this pilot study, we evaluate the feasibility and efficacy of an EMG-based variant of our REINVENT virtual reality (VR) neurofeedback rehabilitation system to increase volitional muscle activity while reducing unintended co-contractions. We recruited four participants in the chronic stage of stroke recovery, all with severely restricted active wrist movement. They completed seven 1-hour training sessions during which our head-mounted VR system reinforced activation of the wrist extensor muscles without flexor activation. Before and after training, participants underwent a battery of clinical and neuromuscular assessments. We found that training improved scores on standardized clinical assessments, equivalent to those previously reported for brain–computer interfaces. Additionally, training may have induced changes in corticospinal communication, as indexed by an increase in 12–30 Hz corticomuscular coherence and by an improved ability to maintain a constant level of wrist muscle activity. Our data support the feasibility of using muscle–computer interfaces in severe chronic stroke, as well as their potential to promote functional recovery and trigger neural plasticity.
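The 12–30 Hz corticomuscular coherence used as an outcome measure above can be estimated with standard Welch-based magnitude-squared coherence. The sketch below is a minimal illustration, not the study's analysis pipeline; the segment length, the synthetic shared 20 Hz drive, and the band-averaging step are all assumptions for demonstration.

```python
import numpy as np
from scipy.signal import coherence

def beta_band_coherence(eeg, emg, fs, band=(12.0, 30.0), nperseg=500):
    """Mean magnitude-squared coherence between EEG and EMG in `band` (Hz)."""
    f, cxy = coherence(eeg, emg, fs=fs, nperseg=nperseg)
    mask = (f >= band[0]) & (f <= band[1])
    return cxy[mask].mean()

# Synthetic demo: a shared 20 Hz cortical drive appears in both signals,
# buried in independent noise; an uncoupled EMG channel serves as control.
fs = 1000
t = np.arange(0, 10.0, 1.0 / fs)
rng = np.random.default_rng(1)
drive = np.sin(2 * np.pi * 20 * t)
eeg = drive + rng.standard_normal(t.size)
emg = drive + rng.standard_normal(t.size)
emg_uncoupled = rng.standard_normal(t.size)

coh_coupled = beta_band_coherence(eeg, emg, fs)
coh_uncoupled = beta_band_coherence(eeg, emg_uncoupled, fs)
```

Coherence is bounded in [0, 1] per frequency bin; the shared oscillatory drive raises the band average well above the noise floor of the uncoupled pair.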


Author(s):  
Athanasios Vourvopoulos ◽  
Octavio Marin Pardo ◽  
Stéphanie Lefebvre ◽  
Meghan Neureither ◽  
David Saldana ◽  
...  

Author(s):  
Woosang Cho ◽  
Nikolaus Sabathiel ◽  
Rupert Ortner ◽  
Alexander Lechner ◽  
Danut C. Irimia ◽  
...  

Conventional therapies do not provide paralyzed patients with closed-loop sensorimotor integration for motor rehabilitation. Paired associative stimulation (PAS) uses brain–computer interface (BCI) technology to monitor patients’ movement imagery in real time, and uses that information to control functional electrical stimulation (FES) and bar feedback, closing the sensorimotor loop. To realize this approach, we introduce the recoveriX system, a hardware and software platform for PAS. After 10 sessions of recoveriX training, one stroke patient partially regained control of dorsiflexion in her paretic wrist. A controlled group study is planned with a new version of the recoveriX system, which will use a new FES system and an avatar instead of bar feedback.


2021 ◽  
Vol 2 ◽  
Author(s):  
Mariusz P. Furmanek ◽  
Madhur Mangalam ◽  
Kyle Lockwood ◽  
Andrea Smith ◽  
Mathew Yarossi ◽  
...  

Technological advancements and increased access have prompted the adoption of head-mounted display based virtual reality (VR) for neuroscientific research, manual skill training, and neurological rehabilitation. Applications that focus on manual interaction within the virtual environment (VE), especially haptic-free VR, critically depend on virtual hand-object collision detection. Knowledge about how multisensory integration related to hand-object collisions affects perception-action dynamics and reach-to-grasp coordination is needed to enhance the immersiveness of interactive VR. Here, we explored whether and to what extent sensory substitution for haptic feedback of hand-object collision (visual, audio, or audiovisual) and collider size (size of spherical pointers representing the fingertips) influences reach-to-grasp kinematics. In Study 1, visual, auditory, or combined feedback were compared as sensory substitutes to indicate the successful grasp of a virtual object during reach-to-grasp actions. In Study 2, participants reached to grasp virtual objects using spherical colliders of different diameters to test if virtual collider size impacts reach-to-grasp. Our data indicate that collider size but not sensory feedback modality significantly affected the kinematics of grasping. Larger colliders led to a smaller size-normalized peak aperture. We discuss this finding in the context of a possible influence of spherical collider size on the perception of the virtual object’s size and hence effects on motor planning of reach-to-grasp. Critically, reach-to-grasp spatiotemporal coordination patterns were robust to manipulations of sensory feedback modality and spherical collider size, suggesting that the nervous system adjusted the reach (transport) component commensurately to the changes in the grasp (aperture) component. These results have important implications for research, commercial, industrial, and clinical applications of VR.
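The size-normalized peak aperture reported above is a standard grasp-kinematics measure: the maximum thumb–index distance during the reach, divided by object size. The sketch below is an illustrative reconstruction under assumed units (meters) and synthetic trajectories, not the study's actual analysis code.

```python
import numpy as np

def size_normalized_peak_aperture(thumb_xyz, index_xyz, object_size):
    """Peak grip aperture (max thumb-index Euclidean distance over the
    reach), normalized by the grasped object's size. Inputs are (T, 3)
    arrays of fingertip positions over T time samples."""
    aperture = np.linalg.norm(thumb_xyz - index_xyz, axis=1)
    return aperture.max() / object_size

# Synthetic demo: the hand opens mid-reach then closes onto a 6 cm object,
# peaking at roughly a 10 cm aperture.
t = np.linspace(0.0, 1.0, 100)
opening = 0.06 + 0.04 * np.sin(np.pi * t)
thumb = np.zeros((100, 3))
index = np.column_stack([opening, np.zeros(100), np.zeros(100)])

ratio = size_normalized_peak_aperture(thumb, index, object_size=0.06)
```

A ratio above 1 reflects the usual safety margin of opening wider than the object; the study's finding corresponds to this ratio shrinking as fingertip collider diameter grows.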


2021 ◽  
Author(s):  
Tien-Thong Nguyen Do ◽  
Thanh Tung Huynh

This study investigates the effects of collaboration on task performance in a brain–computer interface (BCI) based on steady-state visually evoked potentials (SSVEP). Navigation tasks were performed in a virtual environment under two conditions, i.e., individual performance and team performance. The results showed that average task completion time in the collaborative condition decreased by 6 percent compared with individual performance, which is in line with other studies on collaborative BCI (cBCI) and joint decision-making. Our work is a step forward for progress in BCI studies that include multi-user interactions.

