The Study of Influence of Sound on Visual ERP-Based Brain Computer Interface

Sensors ◽  
2020 ◽  
Vol 20 (4) ◽  
pp. 1203 ◽  
Author(s):  
Guizhi Xu ◽  
Yuwei Wu ◽  
Mengfan Li

The performance of the event-related potential (ERP)-based brain–computer interface (BCI) declines when it is applied in real environments, which limits the generality of the BCI. Sound is a common source of noise in daily life, but whether it contributes to this decline is unknown. This study designs a visual-auditory BCI task that requires the subject to focus on the visual interface to output commands while simultaneously counting numbers in an auditory story. The story is played at three speeds to impose different workloads. Data collected under the same or different workloads are used to train and test classifiers. The results show that as the speed of the story increases, the amplitudes of the P300 and N200 potentials decrease by 0.86 μV (p = 0.0239) and 0.69 μV (p = 0.0158) in the occipital-parietal area, leading to a 5.95% decline (p = 0.0101) in accuracy and a 9.53 bits/min decline (p = 0.0416) in information transfer rate. A classifier trained on the high-workload data achieves higher accuracy on high-workload test data than one trained on the low-workload data. These results indicate that sound can affect the visual ERP-BCI by increasing workload, and that a large similarity between training and testing data is as important as ERP amplitude for obtaining high performance, which offers insight into how to make the ERP-BCI generalizable.
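The information transfer rate figures reported throughout these abstracts are conventionally computed with the Wolpaw formula, which combines the number of selectable commands, the classification accuracy, and the selection speed. A minimal sketch (the function name and parameters are illustrative, not from any of the papers):

```python
import math

def wolpaw_itr(n_classes: int, accuracy: float, selections_per_min: float) -> float:
    """Wolpaw information transfer rate in bits/min for an N-class BCI.

    bits/selection = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1))
    """
    p = accuracy
    if n_classes < 2 or not (0.0 < p <= 1.0):
        raise ValueError("need n_classes >= 2 and 0 < accuracy <= 1")
    bits = math.log2(n_classes)
    if p < 1.0:  # the correction terms vanish at perfect accuracy
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n_classes - 1))
    return bits * selections_per_min
```

Note that at chance-level accuracy the formula correctly yields 0 bits/min, which is why a drop in accuracy (as in the high-workload condition above) produces a disproportionate drop in ITR.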

2018 ◽  
Vol 28 (10) ◽  
pp. 1850034 ◽  
Author(s):  
Wei Li ◽  
Mengfan Li ◽  
Huihui Zhou ◽  
Genshe Chen ◽  
Jing Jin ◽  
...  

Increasing the command generation rate of an event-related potential-based brain-robot system is challenging because of the limited information transfer rate of a brain-computer interface system. To improve the rate, we propose a dual-stimuli approach that flashes one robot image while simultaneously scanning another. The two kinds of event-related potentials evoked in this dual-stimuli condition, N200 and P300 potentials, are decoded by a convolutional neural network. Compared with the traditional approaches, the proposed approach significantly improves the online information transfer rate from 23.0 or 17.8 to 39.1 bits/min at an accuracy of 91.7%. These results suggest that combining multiple types of stimuli to evoke distinguishable ERPs is a promising direction for improving the command generation rate of a brain-computer interface.


2018 ◽  
Vol 30 (03) ◽  
pp. 1850022 ◽  
Author(s):  
Rajesh Singla

The advancements in the field of brain–computer interface (BCI) are driven by the underlying motive of improving quality of life for both healthy and locked-in subjects. Since BCIs are based on the response of the human brain to training or external stimuli, performance can be improved either by enhancing the subject training procedure or by improving the external stimuli to produce a maximized event-related potential (ERP). P300 and steady-state visually evoked potential (SSVEP) approaches have been the most common paradigms for stimulus-based BCIs worldwide. Recently, however, many researchers have faced the problem of BCI illiteracy, in which some subjects show ineffective results when training with these BCIs as independent stimuli. The concept of the hybrid brain–computer interface (hBCI) is a step towards eradicating this problem. Our research deals with external stimulus-based ERP generation, in which we experimentally compare three options of visual stimulus: a conventional SSVEP stimulus, a P300-SSVEP hybrid stimulus, and a P300-SSVEP hybrid stimulus with distinct target colors. This paper introduces a novel hBCI paradigm and validates its improved results by comparison with the existing stimulus options. The parameters of comparison were decision accuracy (Acc), information transfer rate (ITR), and false activation rate (FAR).


2016 ◽  
Vol 26 (01) ◽  
pp. 1650001 ◽  
Author(s):  
Erwei Yin ◽  
Timothy Zeyl ◽  
Rami Saab ◽  
Dewen Hu ◽  
Zongtan Zhou ◽  
...  

Most P300 event-related potential (ERP)-based brain–computer interface (BCI) studies focus on gaze shift-dependent BCIs, which cannot be used by people who have lost voluntary eye movement. However, the performance of visual saccade-independent P300 BCIs is generally poor. To improve saccade-independent BCI performance, we propose a bimodal P300 BCI approach that simultaneously employs auditory and tactile stimuli. The proposed P300 BCI is a vision-independent system because no visual interaction is required of the user. Specifically, we designed a direction-congruent bimodal paradigm by randomly and simultaneously presenting auditory and tactile stimuli from the same direction. Furthermore, the channels and number of trials were tailored to each user to improve online performance. With 12 participants, the average online information transfer rate (ITR) of the bimodal approach improved by 45.43% and 51.05% over that attained, respectively, with the auditory and tactile approaches individually. Importantly, the average online ITR of the bimodal approach, including the break time between selections, reached 10.77 bits/min. These findings suggest that the proposed bimodal system holds promise as a practical visual saccade-independent P300 BCI.


2020 ◽  
Vol 20 (3) ◽  
pp. 743-757
Author(s):  
Teng Ma ◽  
Xuezhuan Zhao

The chromatic transient visual evoked potential (CTVEP)-based brain-computer interface (BCI) can provide safer and more comfortable stimuli than traditional VEP-based BCIs, owing to the low-frequency change and absence of luminance variation in its visual stimulation. However, it still generates relatively few codes corresponding to input commands for controlling external devices, which limits its application in practical BCIs. To obtain more codes, we propose a new time-coding technique for the CTVEP-based BCI that combines two 4-bit binary codes to construct four 8-bit binary codes, increasing the number of control commands and extending its practical application. In the experiment, two time-encoded isoluminant chromatic stimuli are combined to serve as different commands for BCI control, and the results show that the new time-coding approach achieves high performance, with an average accuracy of up to 90.28% and an average information transfer rate of up to 27.78 bits/min. The BCI system based on the proposed method proves feasible, stable, and efficient, making it well suited to practical BCI applications such as military, entertainment, and medical uses.
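The code-expansion idea above can be pictured as concatenating every ordered pair of the two 4-bit stimulus codes, which yields exactly four 8-bit command codes. A sketch under that reading (the specific bit patterns are invented for illustration and are not the paper's actual stimulus sequences):

```python
from itertools import product

# Hypothetical 4-bit on/off sequences for the two time-encoded
# isoluminant chromatic stimuli (not the codes used in the paper)
CODE_A = "1010"
CODE_B = "0110"

def build_commands(code_a: str, code_b: str) -> list:
    """Concatenate every ordered pair of the two 4-bit codes,
    producing four distinct 8-bit command codes."""
    return [first + second for first, second in product([code_a, code_b], repeat=2)]

commands = build_commands(CODE_A, CODE_B)
# four 8-bit codes: ['10101010', '10100110', '01101010', '01100110']
```

With k base codes of length n, ordered concatenation gives k² codes of length 2n, which is why two 4-bit codes suffice for four commands.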


Micromachines ◽  
2019 ◽  
Vol 10 (10) ◽  
pp. 681
Author(s):  
Bor-Shyh Lin ◽  
Bor-Shing Lin ◽  
Tzu-Hsiang Yen ◽  
Chien-Chin Hsu ◽  
Yao-Chin Wang

Brain–computer interface (BCI) is a system that allows people to communicate directly with external machines by recognizing brain activities, without manual operation. However, most current BCI systems require conventional electroencephalography (EEG) machines and computers to acquire EEG signals and translate them into control commands, respectively. These machines are usually large, which limits daily applications. Moreover, conventional EEG electrodes require conductive gels to improve EEG signal quality; this causes discomfort and inconvenience of use, and the gels may dry out during prolonged measurements. To address these issues, a wearable headset with a steady-state visually evoked potential (SSVEP)-based BCI is proposed in this study. Active dry electrodes were designed and implemented to acquire good-quality EEG signals from hairy sites without conductive gels. The SSVEP BCI algorithm was implemented in the designed field-programmable gate array (FPGA)-based BCI module to translate SSVEP signals into control commands in real time. Moreover, a commercial tablet was used as the visual stimulus device to provide graphic control icons. The whole system was designed as a wearable device to improve convenience of use in daily life, acquiring and translating EEG signals directly in the front-end headset. Finally, the performance of the proposed system was validated, and the results showed excellent performance (information transfer rate = 36.08 bits/min).


2013 ◽  
Vol 2013 ◽  
pp. 1-8 ◽  
Author(s):  
Shih Chung Chen ◽  
Aaron Raymond See ◽  
Yeou Jiunn Chen ◽  
Chia Hong Yeng ◽  
Chih Kuo Liang

People suffering from paralysis caused by serious neural disorders or spinal cord injury also need a means of recreation beyond general living aids. Although there has been a proliferation of brain computer interface (BCI) applications, developments for recreational activities are rarely seen. The objective of this study is to develop a BCI-based remote control integrated with commercial devices such as the remote-controlled Air Swimmer. The brain is visually stimulated by boxes flickering at preprogrammed frequencies to activate a brain response. After these brain signals are acquired and processed, the frequency of the resulting peak, which corresponds to the user’s selection, is determined by a decision model. A command signal is then sent from the computer to the wireless remote controller via a data acquisition (DAQ) module. A command selection training (CST) and a simulated path test (SPT) were conducted by 12 subjects using the BCI control system, and the experimental results showed recognition accuracy rates of 89.51% and 92.31% for the CST and SPT, respectively. The fastest information transfer rates were 105 bits/min and 41.79 bits/min for the CST and SPT, respectively. The BCI system was proven able to provide a fast and accurate response for a remote-controller application.


2021 ◽  
Vol 2021 ◽  
pp. 1-10
Author(s):  
Ying Mao ◽  
Jing Jin ◽  
Shurui Li ◽  
Yangyang Miao ◽  
Andrzej Cichocki

Tactile perception, the primary sensing channel of the tactile brain-computer interface (BCI), is a complicated process in which skin friction plays a vital role. This study aimed to examine the effects of skin friction on tactile P300 BCI performance. Two oddball paradigms were designed, a silk-stim paradigm (SSP) and a linen-stim paradigm (LSP), in which silk and linen were wrapped around the target vibration motors, respectively. In both paradigms, the disturbance vibrators were wrapped in cotton. The experimental results showed that LSP induced stronger event-related potentials (ERPs) and achieved a higher classification accuracy and information transfer rate (ITR) than SSP. The findings indicate that high skin friction can yield high performance in a tactile BCI. This work provides a novel research direction and a viable basis for future tactile P300 BCIs, which may benefit patients with visual impairments.


Sensors ◽  
2021 ◽  
Vol 21 (13) ◽  
pp. 4578
Author(s):  
Jihyeon Ha ◽  
Sangin Park ◽  
Chang-Hwan Im ◽  
Laehyun Kim

Assistive devices such as meal-assist robots aid individuals with disabilities and support the elderly in performing daily activities. However, existing meal-assist robots are inconvenient to operate due to non-intuitive user interfaces, requiring additional time and effort. Thus, we developed a hybrid brain–computer interface-based meal-assist robot system using three features that can be measured with scalp electrodes for electroencephalography. A single meal cycle comprises the following three procedures. (1) Triple eye-blinks (EBs) from the prefrontal channel are treated as the activation that initiates the cycle. (2) Steady-state visual evoked potentials (SSVEPs) from occipital channels are used to select the food according to the user’s intention. (3) Electromyograms (EMGs) are recorded from temporal channels as the user chews the food, marking the end of a cycle and indicating readiness for the next selection. In experiments with five subjects, accuracy (EBs/SSVEPs/EMGs) was 94.67%/83.33%/97.33%, the false positive rate (EBs/EMGs) was 0.11/0.08 times/min, and the ITR (SSVEPs) was 20.41 bits/min. These results demonstrate the feasibility of this assistive system, which allows users to eat on their own more naturally. Furthermore, it can increase the self-esteem of disabled and elderly people and enhance their quality of life.
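The three-procedure meal cycle described above is essentially a small state machine driven by the three EEG-derived events. A sketch of that control flow (the class, state names, and event strings are illustrative assumptions, not the authors' implementation):

```python
from enum import Enum, auto

class Phase(Enum):
    IDLE = auto()      # waiting for triple eye-blink activation
    SELECT = auto()    # SSVEP-based food selection in progress
    CHEWING = auto()   # EMG chewing detection closes the cycle

class MealCycle:
    """Hypothetical controller mirroring the three-step cycle:
    triple blink -> SSVEP selection -> chewing EMG -> back to idle."""

    def __init__(self):
        self.phase = Phase.IDLE

    def on_event(self, event: str) -> Phase:
        if self.phase is Phase.IDLE and event == "triple_blink":
            self.phase = Phase.SELECT
        elif self.phase is Phase.SELECT and event == "ssvep_choice":
            self.phase = Phase.CHEWING
        elif self.phase is Phase.CHEWING and event == "emg_chew":
            self.phase = Phase.IDLE  # ready for the next meal cycle
        # any other event leaves the phase unchanged
        return self.phase
```

Gating each detector on the current phase is one way such a hybrid pipeline keeps the three signal types from triggering false activations outside their step.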


2020 ◽  
Vol 37 (5) ◽  
pp. 831-837
Author(s):  
Mesut Melek ◽  
Negin Manshouri ◽  
Temel Kayikcioglu

In a brain-computer interface (BCI) system, electroencephalography (EEG) signals are converted into digital signals and analyzed, allowing direct communication between humans and the electronic devices around them. User convenience and the speed of communication with surrounding devices are the most important challenges for BCI systems. The Emotiv Epoc headset minimizes user discomfort thanks to its wet electrodes and easy handling. Continuing our previous work, in this paper we develop a BCI system based on gazing at rotating vanes, using the inexpensive Emotiv Epoc headset. In addition to user comfort, our design achieves an acceptable mean accuracy rate (ACC) and mean information transfer rate (ITR) compared with similar systems.

