Effective Connectivity for Decoding Electroencephalographic Motor Imagery Using a Probabilistic Neural Network

Sensors
2021
Vol. 21 (19)
pp. 6570
Author(s):  
Muhammad Ahsan Awais ◽  
Mohd Zuki Yusoff ◽  
Danish M. Khan ◽  
Norashikin Yahya ◽  
Nidal Kamel ◽  
...  

Motor imagery (MI)-based brain–computer interfaces (BCIs) have gained much attention in recent years. They allow users to control external devices, such as prosthetic arms and wheelchairs, using brain activity. Several researchers have reported communication among multiple brain regions during motor tasks, making it difficult to isolate one or two regions in which motor activity takes place. A deeper understanding of the brain's neural patterns is therefore important for BCIs in order to provide more useful and insightful features, and brain connectivity offers a promising way to address this shortcoming by considering inter-channel/region relationships during motor imagination. This study used effective connectivity in the brain, in terms of the partial directed coherence (PDC) and the directed transfer function (DTF), as relatively unconventional feature sets for MI classification. A MANOVA-based analysis was performed to identify statistically significant connectivity pairs. Furthermore, the study sought to predict MI patterns using four classification algorithms: a support vector machine (SVM), k-nearest neighbors (KNN), a decision tree, and a probabilistic neural network (PNN). The study provides a comparative analysis of all of the classification methods using two-class MI data extracted from the PhysioNet EEG database. The proposed technique based on a PNN as the classifier and the PDC as the feature set outperformed the other classification and feature extraction techniques, with superior classification accuracy and a lower error rate. The findings indicate that with PDC features the PNN attained the greatest overall average accuracy of 98.65%, whereas the same classifier attained at best 82.81% with DTF features.
This study validates the activation of multiple brain regions during a motor task by achieving better classification outcomes with brain connectivity than with conventional features. Since the PDC outperformed the DTF as a feature set, with superior classification accuracy and a low error rate, it has great potential for application in MI-based brain–computer interfaces.
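The PDC features described above are derived from a multivariate autoregressive (MVAR) model fitted to the EEG channels. As a rough illustration, the sketch below evaluates the standard PDC definition from given MVAR coefficient matrices; the function name, the sampling rate of 160 Hz (typical of the PhysioNet EEG database), and the model order are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def pdc(ar_coeffs, freqs, fs=160.0):
    """Partial directed coherence from MVAR coefficients.

    ar_coeffs : array of shape (p, n, n), the matrices A_1..A_p of an
        order-p MVAR model over n channels (assumed already fitted).
    freqs : iterable of frequencies in Hz at which to evaluate the PDC.
    Returns an array of shape (len(freqs), n, n) where entry [f, i, j]
    quantifies the directed influence of channel j on channel i.
    """
    p, n, _ = ar_coeffs.shape
    out = np.empty((len(freqs), n, n))
    for fi, f in enumerate(freqs):
        # A(f) = I - sum_k A_k * exp(-2j*pi*f*k / fs)
        Af = np.eye(n, dtype=complex)
        for k in range(p):
            Af -= ar_coeffs[k] * np.exp(-2j * np.pi * f * (k + 1) / fs)
        mag = np.abs(Af)
        # normalize each column so the squared entries of a column sum to 1
        out[fi] = mag / np.sqrt((mag ** 2).sum(axis=0))
    return out
```

By construction, each column of the PDC matrix has unit squared sum at every frequency, which is the usual normalization that makes outflows from one channel comparable across targets.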

Author(s):  
Pasquale Arpaia ◽  
Francesco Donnarumma ◽  
Antonio Esposito ◽  
Marco Parvis

A method for selecting electroencephalographic (EEG) channels in motor imagery-based brain-computer interfaces (MI-BCI) is proposed to enhance the online interoperability and portability of BCI systems, as well as user comfort. The method also aims to reduce the variability and noise of MI-BCI, which can be aggravated by a large number of EEG channels. The relation between the selected channels and MI-BCI performance is therefore analyzed. The proposed method selects acquisition channels common to all subjects while achieving performance compatible with the use of all channels. Results are reported on a standard benchmark, the BCI Competition IV dataset 2a. They show that performance compatible with the best state-of-the-art approaches can be achieved with a significantly smaller number of channels, in both two-class and four-class classification. In particular, classification accuracy is about 77–83% in binary classification with as few as 6 EEG channels, and above 60% in the four-class case when 10 channels are employed. This contributes to optimizing the EEG measurement setup while developing non-invasive and wearable MI-based brain-computer interfaces.
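The abstract does not specify the selection criterion, so the following is only a hypothetical sketch of how channels common to all subjects might be chosen: each subject assigns every channel a discriminability score (e.g., a Fisher score on band power, computed elsewhere), the channels are ranked per subject, and the k channels with the best average rank across subjects are kept.

```python
import numpy as np

def common_channels(scores, k=6):
    """Pick k channels shared by all subjects via rank aggregation.

    scores : array of shape (n_subjects, n_channels), a per-subject
        discriminability score for each channel (higher is better).
    Returns the sorted indices of the k channels with the best
    (lowest) mean rank across subjects.
    """
    # rank channels within each subject: 0 = most discriminative
    ranks = np.argsort(np.argsort(-scores, axis=1), axis=1)
    mean_rank = ranks.mean(axis=0)
    return np.sort(np.argsort(mean_rank)[:k])
```

Aggregating ranks rather than raw scores makes the selection robust to per-subject scale differences, which matters when the same channel subset must serve every subject.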


2020
Vol. 2020
pp. 1-15
Author(s):  
Xianghong Zhao ◽  
Jieyu Zhao ◽  
Cong Liu ◽  
Weiming Cai

Motor imagery brain-computer interfaces (BCIs) have demonstrated great potential and attracted worldwide attention. Because motor imagery signals are nonstationary, costly and tedious calibration sessions must be completed before use, which keeps these systems out of everyday life. In this paper, source subjects' data are exploited to calibrate for target subjects: a model trained on source subjects is transferred to target subjects, where the critical problem is distribution shift. Classification performance is found to be poor when only the marginal distributions of source and target are brought closer, since the discriminative directions of the two domains may still differ substantially. Joint distribution adaptation is therefore indispensable: it makes a classifier trained in the source domain perform well in the target domain. Specifically, a measure of the joint distribution discrepancy (JDD) between source and target is proposed. Experiments demonstrate that it aligns source and target data according to class, relates directly to classification accuracy, and works well for transfer. Secondly, a deep neural network with joint distribution matching for zero-training motor imagery BCI is proposed. It exploits both marginal and joint distribution adaptation to reduce the distribution discrepancy across subjects and to obtain effective, generalizable features in an aligned common space. Visualizations of intermediate layers illustrate how and why the network works. Experiments on two datasets demonstrate its effectiveness and strength compared to strong counterparts.
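The paper's exact JDD measure is not given in the abstract. To make the idea of joint (rather than only marginal) distribution matching concrete, here is a toy discrepancy in the same spirit: it sums a marginal mean shift with per-class mean shifts, so it stays large whenever same-class samples from source and target remain misaligned, even if the overall means coincide. All names are illustrative, and the sketch assumes every class appears in both domains.

```python
import numpy as np

def toy_jdd(Xs, ys, Xt, yt):
    """Toy joint distribution discrepancy between source and target.

    Xs, Xt : feature arrays of shape (n_samples, n_features).
    ys, yt : integer class labels for the source and target samples.
    Returns the marginal mean shift plus the sum of class-conditional
    mean shifts (0 when the domains are aligned both marginally and
    class by class).
    """
    # marginal term: distance between overall feature means
    d = np.linalg.norm(Xs.mean(axis=0) - Xt.mean(axis=0))
    # conditional terms: distance between per-class feature means
    for c in np.unique(ys):
        d += np.linalg.norm(Xs[ys == c].mean(axis=0)
                            - Xt[yt == c].mean(axis=0))
    return d
```

A measure of this shape can serve as an auxiliary loss during training, pulling same-class samples from the two domains toward a common region of the feature space.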


2016
Vol. 7
Author(s):  
Luz M. Alonso-Valerdi ◽  
David A. Gutiérrez-Begovich ◽  
Janet Argüello-García ◽  
Francisco Sepulveda ◽  
Ricardo A. Ramírez-Mendoza

2021
Author(s):  
Joseph O'Neill ◽  
Jenario Johnson ◽  
Rutledge Detyens ◽  
Roberto W. Batista ◽  
Sorinel Oprisan ◽  
...  

Author(s):  
Lorenza Brusini ◽  
Francesca Stival ◽  
Francesco Setti ◽  
Emanuele Menegatti ◽  
Gloria Menegaz ◽  
...  
