Brain Symmetry Analysis during the Use of a BCI Based on Motor Imagery for the Control of a Lower-Limb Exoskeleton

Symmetry ◽  
2021 ◽  
Vol 13 (9) ◽  
pp. 1746
Author(s):  
Laura Ferrero ◽  
Mario Ortiz ◽  
Vicente Quiles ◽  
Eduardo Iáñez ◽  
José A. Flores ◽  
...  

Brain–Computer Interfaces (BCI) are systems that allow external devices to be controlled by means of brain activity. Several such technologies exist; electroencephalography (EEG) is one example. One of the most common EEG control methods is based on detecting changes in sensorimotor rhythms (SMRs) during motor imagery (MI). The aim of this study was to assess the laterality of cortical function when performing MI of the lower limb. Brain signals from five subjects were analyzed in two conditions, during exoskeleton-assisted gait and while static. Three different EEG electrode configurations were evaluated: covering both hemispheres, covering the non-dominant hemisphere and covering the dominant hemisphere. In addition, the evolution of performance and laterality with practice was assessed. Although slightly superior results were achieved with information from all electrodes, differences between electrode configurations were not statistically significant. Regarding the evolution during the experimental sessions, the performance of the BCI generally improved as the subjects gained experience.
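The comparison of electrode configurations described above can be illustrated with a minimal sketch: log band-power features are computed per channel, and a classifier is cross-validated on the full montage and on each hemisphere's subset. The channel split, synthetic data, and log-variance feature are illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic trials: 100 trials x 8 channels x 512 samples (2 s at 256 Hz).
# Channels 0-3 stand in for one hemisphere, 4-7 for the other.
n_trials, n_channels, n_samples = 100, 8, 512
X_raw = rng.standard_normal((n_trials, n_channels, n_samples))
y = rng.integers(0, 2, n_trials)  # MI vs. rest labels
# Inject a weak class-dependent amplitude change (ERD-like) on all channels.
X_raw[y == 1] *= 0.8

def log_bandpower(trials):
    """Log-variance per channel, a common SMR band-power proxy."""
    return np.log(trials.var(axis=2))

configs = {
    "both hemispheres": slice(0, 8),
    "first hemisphere only": slice(0, 4),
    "second hemisphere only": slice(4, 8),
}
for name, sl in configs.items():
    feats = log_bandpower(X_raw[:, sl, :])
    acc = cross_val_score(LinearDiscriminantAnalysis(), feats, y, cv=5).mean()
    print(f"{name}: {acc:.2f}")
```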

2021 ◽  
Vol 11 (24) ◽  
pp. 11876
Author(s):  
Catalin Dumitrescu ◽  
Ilona-Madalina Costea ◽  
Augustin Semenescu

In recent years, the control of devices “by the power of the mind” has become a much-debated but also well-researched topic, both for state-of-the-art consumer gadgets such as smartphones, laptops, tablets and even smart TVs, and in medicine, where for people with disabilities these technologies may be the only way to communicate with the outside world. It is well known that BCI control is a skill that can be improved through practice and training. This paper aims to improve and diversify the signal processing methods used to implement a brain-computer interface (BCI) based on neurological phenomena recorded during motor imagery (MI) tasks. The aim of the research is to extract, select and classify characteristics of electroencephalogram (EEG) signals based on sensorimotor rhythms for the implementation of BCI systems. The article investigates BCI systems, especially those that use the electroencephalogram to acquire MI tasks, with the goal of allowing users to manipulate virtual quadcopter structures (external, robotic objects) through brain activity correlated with certain mental tasks. The processing chain uses the undecimated wavelet transform (UWT) to reduce noise, Independent Component Analysis (ICA) together with the determination coefficient (r2) for feature selection, and, for classification, a hybrid neural network consisting of Radial Basis Functions (RBF) and a multilayer perceptron–recurrent network (MLP–RNN), obtaining a classification accuracy of 95.5%. Following the tests performed, the use of biopotentials in human–computer interfaces can be considered a viable method for applications in the field of BCI. The results show that BCI training can produce a rapid change in behavioral performance and cognitive properties, and that using more than one training session may be beneficial for improving poor cognitive performance.
To achieve this goal, three steps were taken: understanding the functioning of BCI systems and the neurological phenomena involved; acquiring EEG signals based on sensorimotor rhythms recorded during MI tasks; and applying and optimizing methods for feature extraction, selection and classification using neural networks.
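The r2-based feature selection step mentioned above can be sketched as follows: each feature is ranked by its squared Pearson correlation with the class labels, and the most discriminative features feed a neural network. sklearn's MLPClassifier stands in for the paper's hybrid RBF/MLP–RNN, and the synthetic data and feature count are assumptions for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
n_trials, n_features = 200, 20
X = rng.standard_normal((n_trials, n_features))
y = rng.integers(0, 2, n_trials)
X[:, 0] += y * 1.5  # make one feature class-informative

def r2_scores(X, y):
    """Squared Pearson correlation between each feature and the labels."""
    yc = y - y.mean()
    Xc = X - X.mean(axis=0)
    num = (Xc * yc[:, None]).sum(axis=0)
    den = np.sqrt((Xc ** 2).sum(axis=0) * (yc ** 2).sum())
    return (num / den) ** 2

scores = r2_scores(X, y)
top = np.argsort(scores)[::-1][:5]  # keep the 5 most discriminative features
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
clf.fit(X[:, top], y)
print("selected features:", top)
```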


2021 ◽  
pp. 53-58
Author(s):  
L. Ferrero ◽  
V. Quiles ◽  
M. Ortiz ◽  
E. Iáñez ◽  
A. Navarro-Arcas ◽  
...  

Author(s):  
Rohit Bhat ◽  
Akshay Deshpande ◽  
Rahul Rai ◽  
Ehsan Tarkesh Esfahani

The aim of this paper is to explore a new multimodal Computer Aided Design (CAD) platform based on brain-computer interfaces and touch-based systems. The paper describes experiments and algorithms for manipulating geometrical objects in CAD systems using touch-based gestures and movement imagery detected through brain waves. Gestures on touch-based systems are subject to ambiguity since they are two-dimensional in nature; brain signals are considered here as the main source for resolving these ambiguities. The brainwaves are recorded as electroencephalogram (EEG) signals. Users wear a neuroheadset and try to move and rotate a target object on a touch screen. As they perform these actions, the EEG headset collects brain activity from 14 locations on the scalp. The data is analyzed in the time-frequency domain to detect desynchronization of certain frequency bands (3–7 Hz, 8–13 Hz, 14–20 Hz, 21–29 Hz and 30–50 Hz) in the temporal cortex as an indication of motor imagery.
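The desynchronization measure described above can be sketched as a band-power drop during a task relative to a baseline window, computed for each of the listed frequency bands. The synthetic signal, sampling rate, and window lengths are assumptions for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 256  # sampling rate in Hz (assumed)
rng = np.random.default_rng(2)
baseline = rng.standard_normal(2 * fs)    # 2 s of rest
task = 0.7 * rng.standard_normal(2 * fs)  # 2 s of MI with attenuated rhythms

def band_power(x, lo, hi, fs):
    """Mean power of x after zero-phase band-pass filtering to [lo, hi] Hz."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return np.mean(filtfilt(b, a, x) ** 2)

bands = [(3, 7), (8, 13), (14, 20), (21, 29), (30, 50)]
for lo, hi in bands:
    p_ref = band_power(baseline, lo, hi, fs)
    p_task = band_power(task, lo, hi, fs)
    erd = 100 * (p_task - p_ref) / p_ref  # negative values = desynchronization
    print(f"{lo}-{hi} Hz: ERD {erd:+.1f}%")
```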


e-Neuroforum ◽  
2015 ◽  
Vol 21 (4) ◽  
Author(s):  
Niels Birbaumer ◽  
Ujwal Chaudhary

Brain-computer interfaces (BCI) use neuroelectric and metabolic brain activity to activate peripheral devices and computers without mediation of the motor system. In order to activate the BCI, patients have to learn a certain amount of brain control. Self-regulation of brain activity was found to follow the principles of skill learning and instrumental conditioning. This review focuses on the clinical application of brain-computer interfaces in paralyzed patients with locked-in syndrome and completely locked-in syndrome (CLIS). It was shown that electroencephalogram (EEG)-based brain-computer interfaces allow selection of letters and words in a computer menu with different types of EEG signals. However, in patients with CLIS without any muscular control, particularly of eye movements, classical EEG-based brain-computer interfaces were not successful. Even after implantation of electrodes in the human brain, CLIS patients were unable to communicate. We developed a theoretical model explaining this fundamental deficit in instrumental learning of brain control and voluntary communication: patients in complete paralysis extinguish goal-directed, response-oriented thinking and intentions. Therefore, a reflexive classical conditioning procedure was developed, and metabolic brain signals measured with near-infrared spectroscopy were used in CLIS patients to answer simple questions with a “yes” or “no” brain response. The data collected so far are promising and show that, for the first time, CLIS patients communicate with such a BCI system using metabolic brain signals and simple reflexive learning tasks. Finally, brain-machine interfaces and rehabilitation in chronic stroke are described, demonstrating a surprising recovery of motor function, at both the behavioral and the brain level, in chronic stroke patients without any residual upper-limb movement. 
After extensive combined BCI training with behaviorally oriented physiotherapy, significant improvement in motor function was shown in this previously intractable paralysis. In conclusion, clinical application of brain-machine interfaces in well-defined and circumscribed neurological disorders has demonstrated surprisingly positive effects. The application of BCIs to psychiatric and clinical-psychological problems, however, has so far not resulted in substantial improvement of complex behavioral disorders.


2019 ◽  
Vol 4 (6) ◽  
pp. 1622-1636
Author(s):  
Kevin M. Pitt ◽  
Jonathan S. Brumberg ◽  
Jeremy D. Burnison ◽  
Jyutika Mehta ◽  
Juhi Kidwai

Purpose Brain–computer interface (BCI) techniques may provide computer access for individuals with severe physical impairments. However, the relatively hidden nature of BCI control obscures how BCI systems work behind the scenes, making it difficult to understand “how” electroencephalography (EEG) records the BCI-related brain signals, “what” brain signals are recorded by EEG, and “why” these signals are targeted for BCI control. Furthermore, in the field of speech-language-hearing, signals targeted for BCI application have been of primary interest to clinicians and researchers in the area of augmentative and alternative communication (AAC). However, signals utilized for BCI control reflect sensory, cognitive, and motor processes, which are of interest to a range of related disciplines, including speech science. Method This tutorial was developed by a multidisciplinary team emphasizing primary and secondary BCI-AAC–related signals of interest to speech-language-hearing. Results An overview of BCI-AAC–related signals is provided, discussing (a) “how” BCI signals are recorded via EEG; (b) “what” signals are targeted for noninvasive BCI control, including the P300, sensorimotor rhythms, steady-state evoked potentials, contingent negative variation, and the N400; and (c) “why” these signals are targeted. During tutorial creation, attention was given to help support EEG and BCI understanding for those without an engineering background. Conclusion Tutorials highlighting how BCI-AAC signals are elicited and recorded can help increase interest and familiarity with EEG and BCI techniques and provide a framework for understanding key principles behind BCI-AAC design and implementation.


2013 ◽  
pp. 1549-1570
Author(s):  
Carmen Vidaurre ◽  
Andrea Kübler ◽  
Michael Tangermann ◽  
Klaus-Robert Müller ◽  
José del R. Millán

There is growing interest in the use of brain signals for communication and operation of devices, in particular, for physically disabled people. Brain states can be detected and translated into actions such as selecting a letter from a virtual keyboard, playing a video game, or moving a robot arm. This chapter presents what is known about the effects of visual stimuli on brain activity and introduces means of monitoring brain activity. Possibilities of brain-controlled interfaces, either with the brain signals as the sole input or in combination with the measured point of gaze, are discussed.


2011 ◽  
Vol 23 (3) ◽  
pp. 791-816 ◽  
Author(s):  
Carmen Vidaurre ◽  
Claudia Sannelli ◽  
Klaus-Robert Müller ◽  
Benjamin Blankertz

Brain-computer interfaces (BCIs) allow users to control a computer application by brain activity as acquired, for example, by EEG. In our classic machine learning approach to BCIs, the participants undertake a calibration measurement without feedback to acquire data to train the BCI system. After the training, the user can control a BCI and improve the operation through some type of feedback. However, not all BCI users are able to perform sufficiently well during feedback operation. In fact, a nonnegligible portion of participants (estimated 15%–30%) cannot control the system (a BCI illiteracy problem, generic to all motor-imagery-based BCIs). We hypothesize that one main difficulty for a BCI user is the transition from offline calibration to online feedback. In this work, we investigate adaptive machine learning methods to eliminate offline calibration and analyze the performance of 11 volunteers in a BCI based on the modulation of sensorimotor rhythms. We present an adaptation scheme that individually guides the user. It starts with a subject-independent classifier that evolves to a subject-optimized, state-of-the-art classifier within one session while the user interacts continuously. These initial runs use supervised techniques for robust coadaptive learning of user and machine. Subsequent runs use unsupervised adaptation to track the features’ drift during the session and provide an unbiased measure of BCI performance. Using this approach, without any offline calibration, six users, including one novice, obtained good performance after 3 to 6 minutes of adaptation. More importantly, this novel guided learning also allows participants with BCI illiteracy to gain significant control with the BCI in less than 60 minutes. In addition, one volunteer without a sensorimotor idle rhythm peak at the beginning of the BCI experiment developed it during the course of the session and used voluntary modulation of its amplitude to control the feedback application.
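The unsupervised adaptation idea mentioned above can be sketched as keeping a linear classifier's decision threshold centered by tracking a running mean of its projected outputs, so the threshold follows non-stationary features without labels. The update rule, learning rate, and simulated drift are illustrative assumptions, not the authors' exact method.

```python
import numpy as np

rng = np.random.default_rng(3)
w = np.array([1.0, -1.0])  # fixed linear classifier weights (pretrained, assumed)
b = 0.0                    # bias, adapted online without labels
eta = 0.05                 # adaptation rate
running_mean = 0.0

correct = 0
n = 400
for t in range(n):
    label = t % 2  # alternating MI classes
    # Feature vector: class signal + noise + a slow drift on one feature.
    x = (rng.standard_normal(2)
         + (1.5 if label else -1.5) * np.array([1.0, -1.0])
         + np.array([0.01 * t, 0.0]))
    score = w @ x - b  # classify with the current (adapted) bias
    correct += int((score > 0) == bool(label))
    # Unsupervised update: recenter the threshold on the output running mean.
    running_mean = (1 - eta) * running_mean + eta * (w @ x)
    b = running_mean
print(f"online accuracy with bias adaptation: {correct / n:.2f}")
```

Without the bias update, the simulated drift would eventually push all scores to one side of the threshold; the running-mean recentering keeps the classes separable throughout the session.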


Sensors ◽  
2020 ◽  
Vol 20 (24) ◽  
pp. 7309
Author(s):  
Junhyuk Choi ◽  
Keun Tae Kim ◽  
Ji Hyeok Jeong ◽  
Laehyun Kim ◽  
Song Joo Lee ◽  
...  

This study aimed to develop an intuitive gait-related motor imagery (MI)-based hybrid brain-computer interface (BCI) controller for a lower-limb exoskeleton and investigate the feasibility of the controller under a practical scenario including stand-up, gait-forward, and sit-down. A filter bank common spatial pattern (FBCSP) and mutual information-based best individual feature (MIBIF) selection were used in the study to decode MI electroencephalogram (EEG) signals and extract a feature matrix as an input to the support vector machine (SVM) classifier. A successive eye-blink switch was sequentially combined with the EEG decoder in operating the lower-limb exoskeleton. Ten subjects demonstrated more than 80% accuracy in both the offline (training) and online sessions. All subjects successfully completed a gait task wearing the lower-limb exoskeleton through the developed real-time BCI controller, which achieved a time ratio of 1.45 compared with a manual smartwatch controller. The developed system can potentially benefit people with neurological disorders who may have difficulties operating manual controls.
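The decoding pipeline described above can be sketched in miniature: a small filter bank, CSP filters per band, mutual-information feature selection (sklearn's SelectKBest standing in for MIBIF), and an SVM. The data, band edges, and dimensions are synthetic assumptions for illustration only, not the authors' parameters.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.signal import butter, filtfilt
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.svm import SVC

rng = np.random.default_rng(4)
fs, n_trials, n_ch, n_samp = 128, 80, 6, 256
X = rng.standard_normal((n_trials, n_ch, n_samp))
y = rng.integers(0, 2, n_trials)
X[y == 1, 0] *= 1.8  # class-dependent variance on one channel (MI-like)

def bandpass(trials, lo, hi):
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, trials, axis=2)

def csp(trials, labels, n_pairs=1):
    """CSP filters from the generalized eigendecomposition of class covariances."""
    covs = [np.mean([t @ t.T for t in trials[labels == c]], axis=0) for c in (0, 1)]
    vals, vecs = eigh(covs[0], covs[0] + covs[1])
    idx = np.argsort(vals)
    return vecs[:, np.r_[idx[:n_pairs], idx[-n_pairs:]]]

feats = []
for lo, hi in [(8, 12), (12, 16), (16, 24)]:  # small filter bank
    Xf = bandpass(X, lo, hi)
    W = csp(Xf, y)
    proj = np.einsum("cf,tcs->tfs", W, Xf)
    feats.append(np.log(proj.var(axis=2)))    # log-variance CSP features
F = np.hstack(feats)

sel = SelectKBest(mutual_info_classif, k=4).fit(F, y)  # MIBIF stand-in
clf = SVC(kernel="rbf").fit(sel.transform(F), y)
print("training accuracy:", clf.score(sel.transform(F), y))
```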


2021 ◽  
Vol 11 (9) ◽  
pp. 4106
Author(s):  
Laura Ferrero ◽  
Vicente Quiles ◽  
Mario Ortiz ◽  
Eduardo Iáñez ◽  
José M. Azorín

Lower-limb robotic exoskeletons are wearable devices that can be beneficial for people with lower-extremity motor impairment because they can be valuable in rehabilitation or assistance. These devices can be controlled mentally by means of brain–machine interfaces (BMI). The aim of the present study was the design of a BMI based on motor imagery (MI) to control the gait of a lower-limb exoskeleton. The evaluation is carried out with able-bodied subjects as a preliminary study, since potential users are people with motor limitations. The proposed control works as a state machine, i.e., the decoding algorithm differs for starting (while standing still) and stopping (while walking). The BMI combines two different paradigms for reducing the false-triggering rate (when the BMI identifies irrelevant brain tasks as MI), one based on motor imagery and another one based on the user's attention to the gait. Research was divided into two parts. First, during the training phase, results showed an average accuracy of 68.44 ± 8.46% for the MI paradigm and 65.45 ± 5.53% for the attention paradigm. Then, during the test phase, the exoskeleton was controlled by the BMI and the average performance was 64.50 ± 10.66%, with very few false positives. Participants completed various sessions and there was a significant improvement over time. These results indicate that, after several sessions, the developed system may be employed for controlling a lower-limb exoskeleton, which could benefit people with motor impairment as an assistance device and/or as a therapeutic approach with very limited false activations.
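The state-machine control logic described above can be sketched as follows: a different decoding rule applies depending on whether the exoskeleton is standing or walking, and a command only fires when several consecutive detections agree, a simple guard against false triggering. The class, thresholds, and dummy decoder outputs are illustrative assumptions, not the authors' implementation.

```python
class GaitStateMachine:
    def __init__(self, n_consecutive=3):
        self.state = "STANDING"
        self.n_consecutive = n_consecutive  # agreeing windows needed to act
        self._streak = 0

    def step(self, mi_detected, attention_on_gait):
        """Feed one decoding window; return the command issued, if any."""
        # While standing, both paradigms must agree before starting gait,
        # which lowers the false-triggering rate; while walking, MI alone stops.
        if self.state == "STANDING":
            trigger = mi_detected and attention_on_gait
        else:
            trigger = mi_detected
        self._streak = self._streak + 1 if trigger else 0
        if self._streak >= self.n_consecutive:
            self._streak = 0
            if self.state == "STANDING":
                self.state = "WALKING"
                return "START_GAIT"
            self.state = "STANDING"
            return "STOP_GAIT"
        return None

# Usage: three agreeing windows start gait, three MI windows stop it.
sm = GaitStateMachine()
windows = [(True, True), (True, True), (True, True),
           (True, False), (True, False), (True, False)]
for mi, att in windows:
    cmd = sm.step(mi, att)
    if cmd:
        print(cmd, "->", sm.state)
```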

