A Novel Deep Learning Scheme for Motor Imagery EEG Decoding Based on Spatial Representation Fusion


Bibliographic Details
Published in: IEEE Access, Volume 8, pp. 202100-202110
Main Authors: Yang, Jun; Ma, Zhengmin; Wang, Jin; Fu, Yunfa
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2020
ISSN: 2169-3536
Online Access: Get full text
Description
Summary: Motor imagery electroencephalography (MI-EEG), an important subfield of active brain-computer interface (BCI) systems, can help disabled people consciously and directly control prostheses or external devices, aiding them in certain daily activities. However, the low signal-to-noise ratio and low spatial resolution make MI-EEG decoding a challenging task. Recently, deep neural approaches have shown clear improvements over state-of-the-art BCI methods. In this study, an end-to-end scheme built around a multi-layer convolutional neural network is constructed to form an accurate spatial representation of multi-channel grouped MI-EEG signals and to extract the useful information present in a multi-channel MI signal. Invariant spatial representations are then captured through cross-subject training, enhancing generalization capability via a stacked sparse autoencoder framework inspired by representative deep learning models. Furthermore, a quantitative experimental analysis is conducted on our private dataset and on a public BCI competition dataset. The results show the effectiveness and significance of the proposed methodology.
DOI: 10.1109/ACCESS.2020.3035347
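
The abstract outlines a two-stage pipeline: a multi-layer CNN that learns a spatial representation from multi-channel MI-EEG trials, followed by a stacked sparse autoencoder trained across subjects to capture invariant representations. Below is a minimal PyTorch sketch of such a pipeline; the electrode count, kernel shapes, layer sizes, and sparsity penalty are illustrative assumptions, not the configuration published in the paper.

    # Minimal sketch of a CNN + stacked sparse autoencoder pipeline of the kind
    # the abstract describes. All hyperparameters below are assumptions.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SpatialCNN(nn.Module):
        """Multi-layer CNN extracting a spatial representation from multi-channel MI-EEG."""
        def __init__(self, n_channels=22, n_samples=1000, feat_dim=128):
            super().__init__()
            self.net = nn.Sequential(
                # temporal convolution applied per electrode
                nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12)),
                nn.BatchNorm2d(16),
                nn.ELU(),
                # spatial convolution across all electrodes
                nn.Conv2d(16, 32, kernel_size=(n_channels, 1)),
                nn.BatchNorm2d(32),
                nn.ELU(),
                nn.AvgPool2d(kernel_size=(1, 8)),
            )
            # infer the flattened feature size with a dummy forward pass
            with torch.no_grad():
                flat = self.net(torch.zeros(1, 1, n_channels, n_samples)).numel()
            self.proj = nn.Linear(flat, feat_dim)

        def forward(self, x):  # x: (batch, 1, channels, samples)
            return self.proj(self.net(x).flatten(1))

    class SparseAutoencoder(nn.Module):
        """One layer of a stacked sparse autoencoder; stacking = training layers greedily."""
        def __init__(self, in_dim=128, hidden_dim=64):
            super().__init__()
            self.encoder = nn.Linear(in_dim, hidden_dim)
            self.decoder = nn.Linear(hidden_dim, in_dim)

        def forward(self, z):
            h = torch.sigmoid(self.encoder(z))
            return self.decoder(h), h

    def sparse_loss(recon, target, hidden, rho=0.05, beta=1e-3):
        """Reconstruction error plus a KL-divergence sparsity penalty on mean activations."""
        rho_hat = hidden.mean(dim=0).clamp(1e-6, 1 - 1e-6)
        kl = (rho * torch.log(rho / rho_hat)
              + (1 - rho) * torch.log((1 - rho) / (1 - rho_hat))).sum()
        return F.mse_loss(recon, target) + beta * kl

    # Toy usage on random data standing in for pooled cross-subject MI-EEG trials.
    cnn, sae = SpatialCNN(), SparseAutoencoder()
    x = torch.randn(8, 1, 22, 1000)   # 8 trials, 22 electrodes, 1000 time samples
    z = cnn(x)
    recon, h = sae(z)
    loss = sparse_loss(recon, z.detach(), h)
    loss.backward()

In a full system, each autoencoder layer would typically be pretrained on features pooled from many subjects before a classifier is fine-tuned on top, which is one plausible reading of the cross-subject training the abstract mentions.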