Data Augmentation for Motor Imagery Signal Classification Based on a Hybrid Neural Network
| Published in: | Sensors (Basel, Switzerland), Vol. 20, No. 16, p. 4485 |
|---|---|
| Main authors: | Zhang, Kai; Xu, Guanghua; Han, Zezhen; Ma, Kaiquan; Zheng, Xiaowei; Chen, Longting; Duan, Nan; Zhang, Sicong |
| Format: | Journal Article |
| Language: | English |
| Published: | Basel, Switzerland: MDPI, 11 August 2020 |
| Keywords: | DCGAN; motor imagery; CNN; data augmentation; classification |
| ISSN: | 1424-8220 |
| Online access: | Full text |
| Abstract | As an important paradigm of spontaneous brain-computer interfaces (BCIs), motor imagery (MI) has been widely used in the fields of neurological rehabilitation and robot control. Recently, researchers have proposed various methods for feature extraction and classification based on MI signals. The decoding model based on deep neural networks (DNNs) has attracted significant attention in the field of MI signal processing. Due to the strict requirements for subjects and experimental environments, it is difficult to collect large-scale and high-quality electroencephalogram (EEG) data. However, the performance of a deep learning model depends directly on the size of the datasets. Therefore, the decoding of MI-EEG signals based on a DNN has proven highly challenging in practice. Motivated by this, we investigated the performance of different data augmentation (DA) methods for the classification of MI data using a DNN. First, we transformed the time series signals into spectrogram images using a short-time Fourier transform (STFT). Then, we evaluated and compared the performance of different DA methods for this spectrogram data. Next, we developed a convolutional neural network (CNN) to classify the MI signals and compared the classification performance before and after DA. The Fréchet inception distance (FID) was used to evaluate the quality of the generated data (GD), and the classification accuracy and mean kappa values were used to explore the best CNN-DA method. In addition, analysis of variance (ANOVA) and paired t-tests were used to assess the significance of the results. The results showed that the deep convolutional generative adversarial network (DCGAN) provided better augmentation performance than traditional DA methods: geometric transformation (GT), autoencoder (AE), and variational autoencoder (VAE) (p < 0.01). Public datasets of the BCI competition IV (datasets 1 and 2b) were used to verify the classification performance.
Improvements in the classification accuracies of 17% and 21% (p < 0.01) were observed after DA for the two datasets. In addition, the hybrid network CNN-DCGAN outperformed the other classification methods, with average kappa values of 0.564 and 0.677 for the two datasets. |
|---|---|
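The abstract's first processing step, converting an MI-EEG time series into a spectrogram image with the STFT, can be sketched as follows. This is a minimal NumPy illustration, not the authors' pipeline; the sampling rate, window length, hop size, and the simulated 10 Hz mu-band trial are all assumptions for the example.

```python
import numpy as np

def stft_spectrogram(signal, fs, win_len=64, hop=16):
    """Magnitude spectrogram via a Hann-windowed short-time Fourier transform."""
    window = np.hanning(win_len)
    n_frames = 1 + (len(signal) - win_len) // hop
    frames = np.stack([signal[i * hop : i * hop + win_len] * window
                       for i in range(n_frames)])
    spec = np.abs(np.fft.rfft(frames, axis=1)).T   # (freq bins, time frames)
    freqs = np.fft.rfftfreq(win_len, d=1.0 / fs)
    return freqs, spec

# Simulated 3 s single-channel MI trial at 250 Hz with a 10 Hz (mu-band) rhythm
rng = np.random.default_rng(0)
fs = 250
t = np.arange(0, 3, 1 / fs)
trial = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
freqs, spec = stft_spectrogram(trial, fs)
print(spec.shape)  # (33, 43): 33 frequency bins x 43 time frames
```

The resulting 2-D magnitude array is what a CNN can consume as an image; the paper's actual window parameters are not reproduced here.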
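The mean kappa values reported in the abstract (0.564 and 0.677) are Cohen's kappa scores, which correct raw classification accuracy for chance agreement. A self-contained sketch of the computation; the two-class confusion matrix below is hypothetical and not taken from the paper.

```python
import numpy as np

def cohens_kappa(confusion):
    """Cohen's kappa: (observed - chance agreement) / (1 - chance agreement)."""
    cm = np.asarray(confusion, dtype=float)
    n = cm.sum()
    p_obs = np.trace(cm) / n                              # observed agreement
    p_chance = (cm.sum(axis=0) @ cm.sum(axis=1)) / n**2   # agreement expected by chance
    return (p_obs - p_chance) / (1.0 - p_chance)

# Hypothetical left-hand vs. right-hand MI confusion matrix (rows: true class)
cm = [[80, 20],
      [15, 85]]
print(round(cohens_kappa(cm), 3))  # 0.65
```

Kappa is 0 for chance-level decoding and 1 for perfect agreement, which is why it is a stricter summary than accuracy for two-class MI data.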
| Author | Ma, Kaiquan; Zhang, Sicong; Zheng, Xiaowei; Xu, Guanghua; Duan, Nan; Zhang, Kai; Han, Zezhen; Chen, Longting |
| AuthorAffiliation | 1 School of Mechanical Engineering, Xi’an Jiaotong University, Xi’an 710049, China; zhangkai0912@stu.xjtu.edu.cn (K.Z.); hanzehen@stu.xjtu.edu.cn (Z.H.); mkq1994@stu.xjtu.edu.cn (K.M.); hlydx1314@stu.xjtu.edu.cn (X.Z.); cltdevelop@stu.xjtu.edu.cn (L.C.); shenkong@stu.xjtu.edu.cn (N.D.); zhsicong@mail.xjtu.edu.cn (S.Z.) 2 State Key Laboratory for Manufacturing Systems Engineering, Xi’an Jiaotong University, Xi’an 710049, China |
| BackLink | https://www.ncbi.nlm.nih.gov/pubmed/32796607 – View this record in MEDLINE/PubMed |
| ContentType | Journal Article |
| Copyright | 2020. This work is licensed under http://creativecommons.org/licenses/by/3.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. 2020 by the authors. 2020 |
| DOI | 10.3390/s20164485 |
| Discipline | Engineering |
| EISSN | 1424-8220 |
| ExternalDocumentID | oai_doaj_org_article_f7a79e69d26243659337a28e5994f3aa PMC7474427 32796607 10_3390_s20164485 |
| Genre | Journal Article |
| GrantInformation_xml | – fundername: National Natural Science Foundation of China grantid: 51775415 – fundername: National Key Research & Development Plan of China grantid: 2017YFC1308500 – fundername: Key Research & Development Plan of Shaanxi Province grantid: 2018ZDCXL-GY-06-01 |
| ISICitedReferencesCount | 84 |
| ISSN | 1424-8220 |
| IsDoiOpenAccess | true |
| IsOpenAccess | true |
| IsPeerReviewed | true |
| IsScholarly | true |
| Issue | 16 |
| Keywords | DCGAN; motor imagery; CNN; data augmentation; classification |
| Language | English |
| License | Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). |
| ORCID | 0000-0002-8653-7129 |
| OpenAccessLink | https://www.proquest.com/docview/2434420211?pq-origsite=%requestingapplication% |
| PMID | 32796607 |
| PQID | 2434420211 |
| PQPubID | 2032333 |
| PublicationDate | 2020-08-11 |
| PublicationPlace | Switzerland |
| PublicationTitle | Sensors (Basel, Switzerland) |
| PublicationTitleAlternate | Sensors (Basel) |
| PublicationYear | 2020 |
| Publisher | MDPI AG MDPI |
(TOIS) doi: 10.1145/3057280 – ident: ref_15 doi: 10.1109/ICDAR.2015.7333881 – ident: ref_36 – volume: 35 start-page: 603 year: 2005 ident: ref_45 article-title: Comparison of STFT and wavelet transform methods in determining epileptic seizure activity in EEG signals for real-time application publication-title: Comput. Boil. Med. doi: 10.1016/j.compbiomed.2004.05.001 – volume: 177 start-page: 999 year: 2019 ident: ref_70 article-title: Evolving Images for Visual Neurons Using a Deep Generative Network Reveals Coding Principles and Neuronal Preferences publication-title: Cell doi: 10.1016/j.cell.2019.04.005 |
| Snippet | As an important paradigm of spontaneous brain-computer interfaces (BCIs), motor imagery (MI) has been widely used in the fields of neurological rehabilitation... |
| StartPage | 4485 |
| SubjectTerms | Algorithms; Brain-Computer Interfaces; Classification; CNN; data augmentation; Datasets; DCGAN; Electroencephalography; Humans; Imagination; Methods; motor imagery; Neural networks; Neural Networks, Computer; Noise; Signal processing |
| Title | Data Augmentation for Motor Imagery Signal Classification Based on a Hybrid Neural Network |
| URI | https://www.ncbi.nlm.nih.gov/pubmed/32796607 https://www.proquest.com/docview/2434420211 https://www.proquest.com/docview/2434476227 https://pubmed.ncbi.nlm.nih.gov/PMC7474427 https://doaj.org/article/f7a79e69d26243659337a28e5994f3aa |
| Volume | 20 |
| Citation | Zhang, Kai; Xu, Guanghua; Han, Zezhen; Ma, Kaiquan (2020-08-11). "Data Augmentation for Motor Imagery Signal Classification Based on a Hybrid Neural Network". Sensors (Basel, Switzerland) 20(16). doi:10.3390/s20164485. PMID: 32796607 |