A spatial-frequency-temporal 3D convolutional neural network for motor imagery EEG signal classification


Detailed bibliography
Published in: Signal, Image and Video Processing, Vol. 15, No. 8, pp. 1797–1804
Main authors: Miao, Minmin; Hu, Wenjun; Zhang, Wenbin
Format: Journal Article
Language: English
Publication details: London: Springer London, 01.11.2021 (Springer Nature B.V.)
ISSN: 1863-1703, 1863-1711
Description
Summary: Motor imagery (MI) EEG signal classification is a critical issue for brain–computer interface (BCI) systems. In traditional MI EEG machine learning algorithms, feature extraction and classification often have different objective functions, thus resulting in information loss. To solve this problem, a novel spatial-frequency-temporal (SFT) 3D CNN model is proposed. Specifically, the energies of EEG signals located in multiple local SFT ranges are extracted to obtain a novel 3D MI EEG feature representation, and a novel 3D CNN model is designed to simultaneously learn the complex MI EEG features in the entire SFT domains and carry out classification. An extensive experimental study is implemented on two public EEG datasets to evaluate the effectiveness of our method. For BCI Competition III Dataset IVa, the average accuracy rate of five subjects obtained by the proposed method reaches 86.6% and yields a 4.1% improvement over the state-of-the-art filter band common spatial pattern (FBCSP) method. For BCI Competition III Dataset IIIa, by achieving an average accuracy rate of 91.85%, the proposed method outperforms the state-of-the-art dictionary pair learning (DPL) method by 4.44%.
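The 3D feature representation described in the summary (band-limited signal energies arranged over channels × frequency bands × time windows) can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's actual pipeline: the sampling rate, band edges, window count, and log-energy choice here are assumptions for demonstration only.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def sft_feature_cube(eeg, fs=100,
                     bands=((8, 12), (12, 16), (16, 20), (20, 24), (24, 30)),
                     n_windows=5):
    """Hypothetical sketch: build a (channels, bands, windows) energy cube
    from one MI EEG trial of shape (channels, samples)."""
    n_ch, n_samp = eeg.shape
    win = n_samp // n_windows
    cube = np.zeros((n_ch, len(bands), n_windows))
    for b_idx, (lo, hi) in enumerate(bands):
        # 4th-order Butterworth band-pass, applied per channel (zero-phase)
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, eeg, axis=1)
        for w in range(n_windows):
            seg = filtered[:, w * win:(w + 1) * win]
            # log band energy per channel in this time window
            cube[:, b_idx, w] = np.log(np.mean(seg ** 2, axis=1) + 1e-12)
    return cube

# Example: one simulated trial (22 channels, 3 s at 100 Hz)
rng = np.random.default_rng(0)
trial = rng.standard_normal((22, 300))
cube = sft_feature_cube(trial)
print(cube.shape)  # (22, 5, 5)
```

A cube like this, stacked across trials, is the kind of tensor a 3D CNN (e.g. stacked 3D convolution layers over the spatial, frequency, and temporal axes) could consume jointly, which is the premise of the proposed SFT model.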
DOI: 10.1007/s11760-021-01924-3