SAE+LSTM: A New Framework for Emotion Recognition From Multi-Channel EEG
EEG-based automatic emotion recognition can help brain-inspired robots improve their interactions with humans. This paper presents a novel framework for emotion recognition using multi-channel electroencephalogram (EEG). The framework consists of a linear EEG mixing model and an emotion timing...
| Published in: | Frontiers in Neurorobotics, Vol. 13, p. 37 |
|---|---|
| Main Authors: | Xing, Xiaofen; Li, Zhenqi; Xu, Tianyuan; Shu, Lin; Hu, Bin; Xu, Xiangmin |
| Format: | Journal Article |
| Language: | English |
| Published: | Switzerland: Frontiers Research Foundation / Frontiers Media S.A., 12 June 2019 |
| Subjects: | |
| ISSN: | 1662-5218, 1662-5218 |
| Abstract | EEG-based automatic emotion recognition can help brain-inspired robots improve their interactions with humans. This paper presents a novel framework for emotion recognition using multi-channel electroencephalogram (EEG). The framework consists of a linear EEG mixing model and an emotion timing model. The proposed framework effectively decomposes the EEG source signals from the collected EEG signals and improves classification accuracy by exploiting the context correlations of the EEG feature sequences. Specifically, a Stack AutoEncoder (SAE) is used to build and solve the linear EEG mixing model, and the emotion timing model is based on the Long Short-Term Memory Recurrent Neural Network (LSTM-RNN). The framework was evaluated on the DEAP dataset in an emotion recognition experiment, where it achieved a mean accuracy of 81.10% in valence and 74.38% in arousal, verifying its effectiveness. In the experiments, the framework outperformed the compared conventional approaches to emotion recognition from multi-channel EEG. |
|---|---|
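To make the two-stage design in the abstract concrete, the sketch below pairs a stacked autoencoder, which compresses each multi-channel EEG sample into a latent feature vector (standing in for the linear-mixing / source-decomposition step), with an LSTM that classifies the resulting feature sequence. This is a minimal PyTorch sketch under assumed settings; the channel count (32, as in DEAP), layer sizes, window length, and training details are illustrative and are not the authors' published configuration.

```python
# Illustrative SAE + LSTM pipeline; hyperparameters are assumptions, not the paper's.
import torch
import torch.nn as nn


class StackedAutoEncoder(nn.Module):
    """Encoder/decoder built from linear layers; the encoder output is the
    per-time-step feature vector handed to the LSTM."""

    def __init__(self, n_channels=32, feat_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_channels, 64), nn.Sigmoid(),
            nn.Linear(64, feat_dim), nn.Sigmoid())
        self.decoder = nn.Sequential(
            nn.Linear(feat_dim, 64), nn.Sigmoid(),
            nn.Linear(64, n_channels))   # no output activation: EEG is not range-limited

    def forward(self, x):                # x: (batch, time, channels)
        z = self.encoder(x)              # latent features per time step
        return self.decoder(z), z        # reconstruction and features


class EmotionLSTM(nn.Module):
    """LSTM over the SAE feature sequence, followed by a 2-class head
    (e.g. high vs. low valence or arousal)."""

    def __init__(self, feat_dim=32, hidden_size=64, n_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, n_classes)

    def forward(self, feats):            # feats: (batch, time, feat_dim)
        out, _ = self.lstm(feats)
        return self.head(out[:, -1])     # classify from the last time step


# Dummy usage: 8 trials, 128 time steps, 32 EEG channels (DEAP-like).
sae, clf = StackedAutoEncoder(), EmotionLSTM()
eeg = torch.randn(8, 128, 32)
recon, feats = sae(eeg)                  # stage 1: train the SAE with MSE(recon, eeg)
logits = clf(feats.detach())             # stage 2: train the LSTM on encoded features
print(logits.shape)                      # torch.Size([8, 2])
```

In keeping with the abstract's two models, the autoencoder would first be trained to reconstruct the EEG, and the LSTM would then be trained on the encoded feature sequences for binary valence or arousal classification.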
| Author | Xing, Xiaofen; Li, Zhenqi; Xu, Tianyuan; Shu, Lin; Hu, Bin; Xu, Xiangmin |
| AuthorAffiliation | 1 School of Electronic and Information Engineering, South China University of Technology, Guangzhou, China; 2 School of Information Science and Engineering, Lanzhou University, Lanzhou, China |
| BackLink | https://www.ncbi.nlm.nih.gov/pubmed/31244638 (View this record in MEDLINE/PubMed) |
| ContentType | Journal Article |
| Copyright | 2019. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. Copyright © 2019 Xing, Li, Xu, Shu, Hu and Xu. |
| DOI | 10.3389/fnbot.2019.00037 |
| Discipline | Engineering |
| EISSN | 1662-5218 |
| ExternalDocumentID | oai_doaj_org_article_7d468ff791c647fa9967ff5afb66b408 PMC6581731 31244638 10_3389_fnbot_2019_00037 |
| Genre | Journal Article |
| GeographicLocations | China |
| ISSN | 1662-5218 |
| IsDoiOpenAccess | true |
| IsOpenAccess | true |
| IsPeerReviewed | true |
| IsScholarly | true |
| Keywords | Stack AutoEncoder; LSTM; emotion recognition; neural network; EEG |
| Language | English |
| License | This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
| Notes | Edited by: Jan Babic, Jožef Stefan Institute (IJS), Slovenia. Reviewed by: Sung Chan Jun, Gwangju Institute of Science and Technology, South Korea; Oluwarotimi Williams Samuel, Shenzhen Institutes of Advanced Technology (CAS), China |
| OpenAccessLink | https://doaj.org/article/7d468ff791c647fa9967ff5afb66b408 |
| PMID | 31244638 |
| PQID | 2293988470 |
| PQPubID | 4424403 |
| PublicationCentury | 2000 |
| PublicationDate | 20190612 |
| PublicationDateYYYYMMDD | 2019-06-12 |
| PublicationDecade | 2010 |
| PublicationPlace | Lausanne, Switzerland |
| PublicationTitle | Frontiers in Neurorobotics |
| PublicationTitleAlternate | Front Neurorobot |
| PublicationYear | 2019 |
| Publisher | Frontiers Research Foundation Frontiers Media S.A |
| SourceID | doaj pubmedcentral proquest pubmed crossref |
| SourceType | Open Website; Open Access Repository; Aggregation Database; Index Database; Enrichment Source |
| StartPage | 37 |
| SubjectTerms | Algorithms; Arousal; Asymmetry; Brain research; EEG; emotion recognition; Emotions; International conferences; Long short-term memory; LSTM; Methods; neural network; Neural networks; Noise; Physiology; Robotics and AI; Signal processing; Stack AutoEncoder; Wavelet transforms |
| Title | SAE+LSTM: A New Framework for Emotion Recognition From Multi-Channel EEG |
| URI | https://www.ncbi.nlm.nih.gov/pubmed/31244638 https://www.proquest.com/docview/2293988470 https://www.proquest.com/docview/2248377567 https://pubmed.ncbi.nlm.nih.gov/PMC6581731 https://doaj.org/article/7d468ff791c647fa9967ff5afb66b408 |
| Volume | 13 |