An Efficient LSTM Network for Emotion Recognition From Multichannel EEG Signals

Published in: IEEE Transactions on Affective Computing, Volume 13, Issue 3, pp. 1528-1540
Main authors: Du, Xiaobing, Ma, Cuixia, Zhang, Guanhua, Li, Jinyao, Lai, Yu-Kun, Zhao, Guozhen, Deng, Xiaoming, Liu, Yong-Jin, Wang, Hongan
Format: Journal Article
Language: English
Published: Piscataway: IEEE, 01.07.2022
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Topics:
ISSN: 1949-3045
Online access: Get full text
Abstract Most previous EEG-based emotion recognition methods studied hand-crafted EEG features extracted from different electrodes. In this article, we study the relation among different EEG electrodes and propose a deep learning method to automatically extract the spatial features that characterize the functional relation between EEG signals at different electrodes. Our proposed deep model is called Attention-based LSTM with Domain Discriminator (ATDD-LSTM), a model based on Long Short-Term Memory (LSTM) for emotion recognition that can characterize nonlinear relations among EEG signals of different electrodes. To achieve state-of-the-art emotion recognition performance, the architecture of ATDD-LSTM has two distinguishing characteristics: (1) By applying the attention mechanism to the feature vectors produced by LSTM, ATDD-LSTM automatically selects suitable EEG channels for emotion recognition, which makes the learned model concentrate on the emotion-related channels in response to a given emotion; (2) To minimize the significant feature distribution shift between different sessions and/or subjects, ATDD-LSTM uses a domain discriminator to modify the data representation space and generate domain-invariant features. We evaluate the proposed ATDD-LSTM model on three public EEG emotional databases (DEAP, SEED and CMEED) for emotion recognition. The experimental results demonstrate that our ATDD-LSTM model achieves superior performance on subject-dependent (for the same subject), subject-independent (for different subjects) and cross-session (for the same subject) evaluation.
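The abstract describes three interacting components: an LSTM over per-electrode feature vectors, an attention layer that weights EEG channels, and a domain discriminator that pushes the shared representation toward session- and subject-invariance. Below is a minimal PyTorch sketch of that general idea, not the authors' implementation: all layer sizes, the gradient-reversal mechanism (the standard adversarial adaptation trick from the Ganin et al. work cited in this record), and names such as `ATDDLSTMSketch` are illustrative assumptions.

```python
# Hypothetical sketch of an attention-based LSTM with a domain discriminator.
# Shapes, sizes, and names are assumptions made for illustration only.
import torch
import torch.nn as nn


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; flips gradient sign in the backward pass,
    so the feature extractor is trained to confuse the domain head."""
    @staticmethod
    def forward(ctx, x, lamb):
        ctx.lamb = lamb
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lamb * grad_output, None


class ATDDLSTMSketch(nn.Module):
    def __init__(self, feat_dim, hidden_dim=64, n_emotions=3, n_domains=15):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        self.attn = nn.Linear(hidden_dim, 1)            # one score per channel
        self.emotion_head = nn.Linear(hidden_dim, n_emotions)
        self.domain_head = nn.Linear(hidden_dim, n_domains)

    def forward(self, x, lamb=1.0):
        # x: (batch, n_channels, feat_dim) -- one feature vector per electrode
        h, _ = self.lstm(x)                              # (batch, n_channels, hidden)
        weights = torch.softmax(self.attn(h), dim=1)     # attention over channels
        feat = (weights * h).sum(dim=1)                  # shared representation
        emotion_logits = self.emotion_head(feat)
        domain_logits = self.domain_head(GradReverse.apply(feat, lamb))
        return emotion_logits, domain_logits, weights.squeeze(-1)


# Usage with assumed dimensions: 32 EEG channels, 5 features per channel.
model = ATDDLSTMSketch(feat_dim=5)
x = torch.randn(8, 32, 5)
emotion_logits, domain_logits, channel_attention = model(x)
```

In training, the emotion head would be optimized with a standard classification loss, while the reversed gradients flowing back from the domain head would encourage features that cannot distinguish sessions or subjects, matching the adversarial scheme the abstract describes.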
Author Zhang, Guanhua
Lai, Yu-Kun
Wang, Hongan
Zhao, Guozhen
Li, Jinyao
Liu, Yong-Jin
Ma, Cuixia
Du, Xiaobing
Deng, Xiaoming
Author_xml – sequence: 1
  givenname: Xiaobing
  surname: Du
  fullname: Du, Xiaobing
  email: duxiaobing16@mails.ucas.ac.cn
  organization: Beijing Key Laboratory of Human Computer Interactions, Institute of Software, Chinese Academy of Sciences, Beijing, China
– sequence: 2
  givenname: Cuixia
  orcidid: 0000-0003-3999-7429
  surname: Ma
  fullname: Ma, Cuixia
  email: cuixia@iscas.ac.cn
  organization: State Key Laboratory of Computer Science, Institute of Software, Chinese Academy of Sciences, Beijing, China
– sequence: 3
  givenname: Guanhua
  surname: Zhang
  fullname: Zhang, Guanhua
  email: zgh17@mails.tsinghua.edu.cn
  organization: Department of Computer Science and Technology, BNRist, MOE-Key Laboratory of Pervasive Computing, Tsinghua University, Beijing, China
– sequence: 4
  givenname: Jinyao
  surname: Li
  fullname: Li, Jinyao
  email: lijinyao19@mails.ucas.ac.cn
  organization: Beijing Key Laboratory of Human Computer Interactions, Institute of Software, Chinese Academy of Sciences, Beijing, China
– sequence: 5
  givenname: Yu-Kun
  orcidid: 0000-0002-2094-5680
  surname: Lai
  fullname: Lai, Yu-Kun
  email: laiy4@cardiff.ac.uk
  organization: School of Computer Science and Informatics, Cardiff University, Cardiff, Wales, U.K.
– sequence: 6
  givenname: Guozhen
  orcidid: 0000-0003-4438-5320
  surname: Zhao
  fullname: Zhao, Guozhen
  email: zhaogz@psych.ac.cn
  organization: CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, China
– sequence: 7
  givenname: Xiaoming
  surname: Deng
  fullname: Deng, Xiaoming
  email: xiaoming@iscas.ac.cn
  organization: Beijing Key Laboratory of Human Computer Interactions, Institute of Software, Chinese Academy of Sciences, Beijing, China
– sequence: 8
  givenname: Yong-Jin
  orcidid: 0000-0001-5774-1916
  surname: Liu
  fullname: Liu, Yong-Jin
  email: liuyongjin@tsinghua.edu.cn
  organization: Department of Computer Science and Technology, BNRist, MOE-Key Laboratory of Pervasive Computing, Tsinghua University, Beijing, China
– sequence: 9
  givenname: Hongan
  surname: Wang
  fullname: Wang, Hongan
  email: hongan@iscas.ac.cn
  organization: State Key Laboratory of Computer Science, Institute of Software, Chinese Academy of Sciences, Beijing, China
BookMark eNp9kF9PwjAUxRuDiYh8AX1p4vOwf9ZtfSRkoAlIIvjcdF2HxdFiV2L89m5AjPHB-3LOw_nd3HuuQc86qwG4xWiEMeIP6_F0OhkRRNCIIkxTjC9AH_OYRxTFrPfLX4Fh02xRO5TShKR9sBxbmFeVUUbbAOer9QI-6_Dp_DusnIf5zgXjLHzRym2sOfqpdzu4ONTBqDdpra5hns_gymysrJsbcFm1oodnHYDXab6ePEbz5expMp5HilIeIs7KQqZFyUipJNGMKlZkcUniNNEswzxTWcyZUhhRnFCuWYXLLC0kSQhXKEvpANyf9u69-zjoJoitO_juAkFSxDFLWMLbVHZKKe-axutKKBNk90Xw0tQCI9E1KI4Niq5BcW6wRckfdO_NTvqv_6G7E2S01j9Ae03MWEq_AWGpfCs
CODEN ITACBQ
CitedBy_id crossref_primary_10_1088_1741_2552_aced22
crossref_primary_10_1007_s12293_024_00434_2
crossref_primary_10_1109_JBHI_2021_3049119
crossref_primary_10_1109_TAFFC_2023_3261867
crossref_primary_10_1109_TCE_2025_3533002
crossref_primary_10_1007_s11571_024_10162_5
crossref_primary_10_1109_TNNLS_2023_3305621
crossref_primary_10_3389_fnbot_2024_1442080
crossref_primary_10_3389_fnins_2024_1400444
crossref_primary_10_1007_s11571_025_10324_z
crossref_primary_10_1007_s13534_025_00475_7
crossref_primary_10_1016_j_bspc_2022_103612
crossref_primary_10_3389_fpsyg_2024_1326791
crossref_primary_10_1109_TNSRE_2021_3137340
crossref_primary_10_1109_TAFFC_2022_3221554
crossref_primary_10_1109_TAFFC_2024_3433470
crossref_primary_10_3390_brainsci14080817
crossref_primary_10_3389_fncom_2025_1589247
crossref_primary_10_1016_j_inffus_2025_103279
crossref_primary_10_1016_j_inffus_2023_102220
crossref_primary_10_1109_ACCESS_2024_3412328
crossref_primary_10_1109_JBHI_2022_3198688
crossref_primary_10_1007_s11042_023_16294_w
crossref_primary_10_1109_TNSRE_2024_3380595
crossref_primary_10_3389_fnins_2024_1458815
crossref_primary_10_1142_S0219519425400184
crossref_primary_10_1109_ACCESS_2024_3454082
crossref_primary_10_1007_s11760_025_04124_5
crossref_primary_10_1016_j_jneumeth_2024_110317
crossref_primary_10_1016_j_bspc_2024_107435
crossref_primary_10_3389_fpsyt_2025_1633860
crossref_primary_10_1007_s11760_024_03360_5
crossref_primary_10_1016_j_knosys_2025_113368
crossref_primary_10_1016_j_neunet_2025_107614
crossref_primary_10_1186_s40708_025_00265_y
crossref_primary_10_1109_TNSRE_2022_3173724
crossref_primary_10_1088_1361_6579_ad9661
crossref_primary_10_3389_fnhum_2024_1324897
crossref_primary_10_1016_j_compbiomed_2023_107450
crossref_primary_10_1109_JBHI_2023_3240891
crossref_primary_10_1088_2057_1976_ad0f3f
crossref_primary_10_1109_TCE_2024_3414154
crossref_primary_10_1007_s40031_024_01079_y
crossref_primary_10_1007_s11709_024_1092_0
crossref_primary_10_1016_j_neunet_2025_107457
crossref_primary_10_1109_ACCESS_2024_3436556
crossref_primary_10_26599_BDMA_2024_9020071
crossref_primary_10_1007_s00521_024_10469_8
crossref_primary_10_1016_j_bspc_2024_106276
crossref_primary_10_3389_fnins_2025_1592070
crossref_primary_10_1007_s11571_023_10034_4
crossref_primary_10_1109_TAFFC_2023_3319397
crossref_primary_10_1186_s40537_025_01177_8
crossref_primary_10_1007_s11042_023_14354_9
crossref_primary_10_1016_j_neunet_2025_107483
crossref_primary_10_1109_TNSRE_2021_3111689
crossref_primary_10_3389_fnins_2023_1247082
crossref_primary_10_1109_TAFFC_2023_3336531
crossref_primary_10_1016_j_bspc_2021_103361
crossref_primary_10_1109_JSEN_2023_3343358
crossref_primary_10_15302_J_QB_021_0267
crossref_primary_10_1109_TAFFC_2024_3480355
crossref_primary_10_1007_s42486_021_00078_y
crossref_primary_10_1088_2057_1976_acf137
crossref_primary_10_1109_TAFFC_2022_3170369
crossref_primary_10_1016_j_engappai_2025_110004
crossref_primary_10_1109_TIFS_2025_3602266
crossref_primary_10_1007_s10115_025_02354_0
crossref_primary_10_1016_j_neucom_2024_128445
crossref_primary_10_1016_j_neucom_2024_129255
crossref_primary_10_3390_ai6090215
crossref_primary_10_1016_j_neunet_2025_107800
crossref_primary_10_7717_peerj_cs_2065
crossref_primary_10_3390_s23041917
crossref_primary_10_1109_TAFFC_2022_3163609
crossref_primary_10_1109_TNNLS_2022_3168935
crossref_primary_10_3390_brainsci13091293
crossref_primary_10_1016_j_bspc_2025_107550
crossref_primary_10_1016_j_knosys_2024_112826
crossref_primary_10_1109_TAFFC_2023_3288118
crossref_primary_10_3390_math12071048
crossref_primary_10_3389_fninf_2023_1067095
crossref_primary_10_1016_j_cogsys_2023_101152
crossref_primary_10_1016_j_compbiomed_2024_108857
crossref_primary_10_1109_TIM_2023_3240230
crossref_primary_10_1016_j_asoc_2025_113659
crossref_primary_10_1088_1742_6596_1693_1_012206
crossref_primary_10_1109_TIM_2025_3533618
crossref_primary_10_1007_s11760_022_02447_1
crossref_primary_10_1016_j_knosys_2025_114115
crossref_primary_10_1109_TAFFC_2023_3334520
crossref_primary_10_1109_JBHI_2023_3242090
crossref_primary_10_3390_bioengineering11080782
crossref_primary_10_1016_j_jksuci_2023_03_019
crossref_primary_10_1109_TAFFC_2023_3288885
crossref_primary_10_1109_TIM_2025_3565702
crossref_primary_10_1186_s40708_024_00242_x
crossref_primary_10_1016_j_knosys_2024_111826
crossref_primary_10_1109_JSEN_2024_3514094
crossref_primary_10_1109_JSEN_2023_3309260
crossref_primary_10_1109_TCSS_2024_3412074
crossref_primary_10_1016_j_autcon_2023_104892
crossref_primary_10_1109_TNNLS_2024_3493425
crossref_primary_10_1080_10447318_2023_2278926
crossref_primary_10_1016_j_neucom_2025_130418
crossref_primary_10_1016_j_neunet_2023_04_045
crossref_primary_10_3389_fnins_2024_1355512
crossref_primary_10_1080_10255842_2023_2252551
crossref_primary_10_1007_s12539_025_00750_2
crossref_primary_10_1016_j_knosys_2025_113613
crossref_primary_10_1088_1741_2552_ad618a
crossref_primary_10_1109_TNNLS_2023_3319315
crossref_primary_10_1007_s11571_025_10272_8
crossref_primary_10_1016_j_csi_2025_103973
crossref_primary_10_1016_j_eswa_2025_127456
crossref_primary_10_1109_TBCAS_2021_3089132
crossref_primary_10_1109_TSMC_2023_3340710
crossref_primary_10_1007_s11571_023_10004_w
crossref_primary_10_1007_s11517_025_03295_0
crossref_primary_10_1109_TSMC_2024_3458949
crossref_primary_10_3390_math13010087
crossref_primary_10_1109_TFUZZ_2024_3434709
crossref_primary_10_1109_JSEN_2022_3172133
crossref_primary_10_3389_fnins_2024_1479570
crossref_primary_10_1109_ACCESS_2024_3375393
crossref_primary_10_3389_fnins_2023_1213099
crossref_primary_10_1016_j_inffus_2023_102019
crossref_primary_10_1016_j_measurement_2024_116046
crossref_primary_10_1016_j_neucom_2024_128354
crossref_primary_10_1109_TAFFC_2024_3477302
crossref_primary_10_1088_1741_2552_ad085a
crossref_primary_10_1109_TAI_2024_3445325
crossref_primary_10_1109_TCYB_2025_3550191
crossref_primary_10_1109_TNSRE_2023_3320693
crossref_primary_10_1007_s10489_023_04971_0
crossref_primary_10_1109_JBHI_2024_3403188
crossref_primary_10_1007_s40998_024_00710_4
crossref_primary_10_1109_TIM_2025_3571107
crossref_primary_10_1016_j_neucom_2024_128920
crossref_primary_10_1016_j_knosys_2025_113752
crossref_primary_10_3390_math12081180
crossref_primary_10_1016_j_eswa_2025_129654
crossref_primary_10_1016_j_inffus_2023_102129
crossref_primary_10_1109_JBHI_2023_3307606
crossref_primary_10_3390_math13071072
crossref_primary_10_1016_j_inffus_2025_103417
crossref_primary_10_1016_j_bspc_2025_108443
crossref_primary_10_1016_j_bspc_2025_107594
crossref_primary_10_1016_j_bspc_2024_106249
crossref_primary_10_1007_s00371_024_03652_4
crossref_primary_10_3389_fninf_2024_1303380
crossref_primary_10_1109_ACCESS_2023_3344476
crossref_primary_10_1038_s41598_025_96616_0
crossref_primary_10_1007_s13534_023_00316_5
crossref_primary_10_1109_TIM_2024_3428607
crossref_primary_10_1088_1741_2552_ad9956
crossref_primary_10_1016_j_knosys_2025_114318
crossref_primary_10_1016_j_eswa_2025_129422
crossref_primary_10_3389_fnins_2023_1345770
crossref_primary_10_1007_s10462_023_10513_4
crossref_primary_10_3390_s24113464
Cites_doi 10.1109/TCYB.2018.2797176
10.1016/S0959-4388(02)00301-X
10.1109/34.954607
10.5555/2946645.2946704
10.1109/TAFFC.2018.2840973
10.1007/978-981-13-2354-6_5
10.1109/TAFFC.2017.2714671
10.1016/0013-4694(70)90143-4
10.18653/v1/p17-1036
10.1109/TAFFC.2018.2817622
10.1111/j.1469-8986.1993.tb03207.x
10.1109/TNSRE.2011.2174652
10.1109/EMBC.2013.6611075
10.1162/neco.1997.9.8.1735
10.1007/978-3-319-70093-9_86
10.1088/1741-2560/14/1/016003
10.1109/TAMD.2015.2431497
10.1016/j.biopsycho.2004.03.002
10.1109/TBME.2007.893452
10.1007/978-3-030-04221-9_25
10.3389/fnbeh.2018.00225
10.1016/j.neubiorev.2011.05.001
10.1038/s41598-018-26133-w
10.1609/aaai.v31i2.19105
10.1162/089976698300017467
10.1109/NER.2011.5910636
10.1016/j.cmpb.2016.08.010
10.1007/978-3-642-38803-3_6
10.1109/T-AFFC.2012.16
10.1109/TAFFC.2017.2660485
10.1007/s12553-019-00394-5
10.5555/3045118.3045336
10.1109/TNN.2010.2091281
10.18653/v1/N16-1180
10.1007/s00221-014-3902-4
10.18653/v1/D15-1166
10.24963/ijcai.2018/216
10.18653/v1/d16-1058
10.1109/TAFFC.2017.2772882
10.1109/TITB.2009.2034649
10.1109/BIBM.2016.7822545
10.1109/TBME.2014.2347318
10.1109/ACII.2015.7344684
10.1016/j.biopsycho.2004.03.003
10.1177/1534582302001001003
10.1109/TAFFC.2014.2339834
10.1037/h0054570
10.1109/TCDS.2016.2587290
10.1109/CVPR.2017.316
10.1109/TAFFC.2017.2712143
10.1109/T-AFFC.2011.15
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022
Copyright_xml – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022
DBID 97E
RIA
RIE
AAYXX
CITATION
7SC
8FD
JQ2
L7M
L~C
L~D
DOI 10.1109/TAFFC.2020.3013711
DatabaseName IEEE All-Society Periodicals Package (ASPP) 2005–Present
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL)
CrossRef
Computer and Information Systems Abstracts
Technology Research Database
ProQuest Computer Science Collection
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts – Academic
Computer and Information Systems Abstracts Professional
DatabaseTitle CrossRef
Computer and Information Systems Abstracts
Technology Research Database
Computer and Information Systems Abstracts – Academic
Advanced Technologies Database with Aerospace
ProQuest Computer Science Collection
Computer and Information Systems Abstracts Professional
DatabaseTitleList Computer and Information Systems Abstracts

Database_xml – sequence: 1
  dbid: RIE
  name: IEEE Electronic Library (IEL)
  url: https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
Discipline Computer Science
EISSN 1949-3045
EndPage 1540
ExternalDocumentID 10_1109_TAFFC_2020_3013711
9154557
Genre orig-research
GrantInformation_xml – fundername: National Key Research and Development Program of China
  grantid: 2016YFB1001200
– fundername: National Natural Science Foundation of China; Natural Science Foundation of China
  grantid: U1736220; 61725204; 61872346
  funderid: 10.13039/501100001809
GroupedDBID 0R~
4.4
5VS
6IK
97E
AAJGR
AARMG
AASAJ
AAWTH
ABAZT
ABJNI
ABQJQ
ABVLG
AENEX
AGQYO
AGSQL
AHBIQ
AKJIK
AKQYR
ALMA_UNASSIGNED_HOLDINGS
ATWAV
BEFXN
BFFAM
BGNUA
BKEBE
BPEOZ
EBS
EJD
HZ~
IEDLZ
IFIPE
IPLJI
JAVBF
M43
O9-
OCL
PQQKQ
RIA
RIE
RNI
RZB
AAYXX
CITATION
7SC
8FD
JQ2
L7M
L~C
L~D
ID FETCH-LOGICAL-c339t-95dba7bd52dca2e53c5b84d2476e58198c8495cc1031639e5f1d87ba2629c0873
IEDL.DBID RIE
ISICitedReferencesCount 228
ISICitedReferencesURI http://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=Summon&SrcAuth=ProQuest&DestLinkType=CitingArticles&DestApp=WOS_CPL&KeyUT=000849263500029&url=https%3A%2F%2Fcvtisr.summon.serialssolutions.com%2F%23%21%2Fsearch%3Fho%3Df%26include.ft.matches%3Dt%26l%3Dnull%26q%3D
ISSN 1949-3045
IngestDate Sun Nov 09 05:30:54 EST 2025
Sat Nov 29 04:16:08 EST 2025
Tue Nov 18 19:50:35 EST 2025
Wed Aug 27 02:29:15 EDT 2025
IsDoiOpenAccess false
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 3
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
LinkModel DirectLink
MergedId FETCHMERGED-LOGICAL-c339t-95dba7bd52dca2e53c5b84d2476e58198c8495cc1031639e5f1d87ba2629c0873
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 14
ORCID 0000-0003-4438-5320
0000-0002-2094-5680
0000-0003-3999-7429
0000-0001-5774-1916
OpenAccessLink http://ir.psych.ac.cn/handle/311026/39338
PQID 2709156569
PQPubID 2040414
PageCount 13
ParticipantIDs crossref_citationtrail_10_1109_TAFFC_2020_3013711
ieee_primary_9154557
crossref_primary_10_1109_TAFFC_2020_3013711
proquest_journals_2709156569
PublicationCentury 2000
PublicationDate 2022-07-01
PublicationDateYYYYMMDD 2022-07-01
PublicationDate_xml – month: 07
  year: 2022
  text: 2022-07-01
  day: 01
PublicationDecade 2020
PublicationPlace Piscataway
PublicationPlace_xml – name: Piscataway
PublicationTitle IEEE transactions on affective computing
PublicationTitleAbbrev TAFFC
PublicationYear 2022
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
References ref13
ref57
ref12
ref15
ref59
ref14
ref58
ref52
ref11
ref55
ref10
ref54
ref17
ref16
ref19
ref18
ref51
ref46
ref45
ref47
ref42
ref41
ref49
Defferrard (ref56)
ref8
ref7
ref9
ref4
ref3
ref6
ref5
ref40
Mnih (ref43)
ref35
Keltner (ref29) 2000
ref34
ref37
ref36
ref31
ref30
ref33
ref32
ref2
ref1
ref39
ref38
Bahdanau (ref44) 2014
Hoffman (ref48)
ref24
ref23
ref26
ref25
ref20
ref22
ref21
ref28
ref27
ref60
Ganin (ref53); 37
Long (ref50)
References_xml – ident: ref38
  doi: 10.1109/TCYB.2018.2797176
– ident: ref52
  doi: 10.1016/S0959-4388(02)00301-X
– ident: ref1
  doi: 10.1109/34.954607
– ident: ref47
  doi: 10.5555/2946645.2946704
– ident: ref19
  doi: 10.1109/TAFFC.2018.2840973
– ident: ref26
  doi: 10.1007/978-981-13-2354-6_5
– ident: ref37
  doi: 10.1109/TAFFC.2017.2714671
– ident: ref32
  doi: 10.1016/0013-4694(70)90143-4
– ident: ref55
  doi: 10.18653/v1/p17-1036
– ident: ref14
  doi: 10.1109/TAFFC.2018.2817622
– ident: ref58
  doi: 10.1111/j.1469-8986.1993.tb03207.x
– ident: ref8
  doi: 10.1109/TNSRE.2011.2174652
– ident: ref36
  doi: 10.1109/EMBC.2013.6611075
– ident: ref41
  doi: 10.1162/neco.1997.9.8.1735
– ident: ref39
  doi: 10.1007/978-3-319-70093-9_86
– ident: ref40
  doi: 10.1088/1741-2560/14/1/016003
– start-page: 1989
  ident: ref48
  article-title: CyCADA: Cycle-consistent adversarial domain adaptation
– ident: ref57
  doi: 10.1109/TAMD.2015.2431497
– ident: ref59
  doi: 10.1016/j.biopsycho.2004.03.002
– ident: ref6
  doi: 10.1109/TBME.2007.893452
– year: 2014
  ident: ref44
  article-title: Neural machine translation by jointly learning to align and translate
– ident: ref51
  doi: 10.1007/978-3-030-04221-9_25
– ident: ref16
  doi: 10.3389/fnbeh.2018.00225
– ident: ref5
  doi: 10.1016/j.neubiorev.2011.05.001
– ident: ref17
  doi: 10.1038/s41598-018-26133-w
– ident: ref23
  doi: 10.1609/aaai.v31i2.19105
– start-page: 2204
  volume-title: Proc. 27th Int. Conf. Neural Inf. Process. Syst.
  ident: ref43
  article-title: Recurrent models of visual attention
– ident: ref27
  doi: 10.1162/089976698300017467
– ident: ref35
  doi: 10.1109/NER.2011.5910636
– ident: ref10
  doi: 10.1016/j.cmpb.2016.08.010
– ident: ref33
  doi: 10.1007/978-3-642-38803-3_6
– start-page: 1647
  volume-title: Proc. 32nd Int. Conf. Neural Inf. Process. Syst.
  ident: ref50
  article-title: Conditional adversarial domain adaptation
– ident: ref3
  doi: 10.1109/T-AFFC.2012.16
– ident: ref11
  doi: 10.1109/TAFFC.2017.2660485
– start-page: 236
  volume-title: Handbook of Emotions
  year: 2000
  ident: ref29
  article-title: The psychophysiology of emotion
– volume: 37
  start-page: 1180
  ident: ref53
  article-title: Unsupervised domain adaptation by backpropagation
– ident: ref18
  doi: 10.1007/s12553-019-00394-5
– ident: ref42
  doi: 10.5555/3045118.3045336
– ident: ref28
  doi: 10.1109/TNN.2010.2091281
– ident: ref54
  doi: 10.18653/v1/N16-1180
– ident: ref60
  doi: 10.1007/s00221-014-3902-4
– ident: ref45
  doi: 10.18653/v1/D15-1166
– ident: ref25
  doi: 10.24963/ijcai.2018/216
– ident: ref46
  doi: 10.18653/v1/d16-1058
– ident: ref7
  doi: 10.1109/TAFFC.2017.2772882
– ident: ref34
  doi: 10.1109/TITB.2009.2034649
– start-page: 3837
  volume-title: Proc. 30th Int. Conf. Neural Inf. Process. Syst.
  ident: ref56
  article-title: Convolutional neural networks on graphs with fast localized spectral filtering
– ident: ref12
  doi: 10.1109/BIBM.2016.7822545
– ident: ref9
  doi: 10.1109/TBME.2014.2347318
– ident: ref22
  doi: 10.1109/ACII.2015.7344684
– ident: ref15
  doi: 10.1016/j.biopsycho.2004.03.003
– ident: ref4
  doi: 10.1177/1534582302001001003
– ident: ref31
  doi: 10.1109/TAFFC.2014.2339834
– ident: ref30
  doi: 10.1037/h0054570
– ident: ref13
  doi: 10.1109/TCDS.2016.2587290
– ident: ref49
  doi: 10.1109/CVPR.2017.316
– ident: ref20
  doi: 10.1109/TAFFC.2017.2712143
– ident: ref2
  doi: 10.1109/T-AFFC.2011.15
– ident: ref24
  doi: 10.1109/TNN.2010.2091281
– ident: ref21
  doi: 10.1109/TAMD.2015.2431497
SSID ssj0000333627
Score 2.662248
SourceID proquest
crossref
ieee
SourceType Aggregation Database
Enrichment Source
Index Database
Publisher
StartPage 1528
SubjectTerms attention mechanism
Brain modeling
Channels
Data models
domain adaptation
Domains
Electrodes
Electroencephalography
Emotion recognition
Emotions
Feature extraction
Frequency-domain analysis
LSTM
Machine learning
multichannel EEG
Title An Efficient LSTM Network for Emotion Recognition From Multichannel EEG Signals
URI https://ieeexplore.ieee.org/document/9154557
https://www.proquest.com/docview/2709156569
Volume 13
WOSCitedRecordID wos000849263500029&url=https%3A%2F%2Fcvtisr.summon.serialssolutions.com%2F%23%21%2Fsearch%3Fho%3Df%26include.ft.matches%3Dt%26l%3Dnull%26q%3D
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
journalDatabaseRights – providerCode: PRVIEE
  databaseName: IEEE Electronic Library (IEL)
  customDbUrl:
  eissn: 1949-3045
  dateEnd: 99991231
  omitProxy: false
  ssIdentifier: ssj0000333627
  issn: 1949-3045
  databaseCode: RIE
  dateStart: 20100101
  isFulltext: true
  titleUrlDefault: https://ieeexplore.ieee.org/
  providerName: IEEE
link http://cvtisr.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1LS8QwEB508eDFt7i-yMGbVpukaZLjIq0edBV3hb2VNklFWLuy7vr7TdIHiCJ4KOSQlJJpMt8k880HcJZjEZc28AmMCmkQMRYHeamigGkqsY248rzQXmyCD4diMpGPK3DRcWGMMT75zFy6pr_L1zO1dEdlV9L5e8ZXYZVzXnO1uvOUkFK7F_OWFxPKq_EgTa9tBEhsYOoK62H8zfd4MZUfO7B3K-nm_z5oCzYa-IgGtb23YcVUO7DZSjOgZqXuwsOgQomvDmFfgO5G43s0rBO-kUWpKKnFe9BTmz5k2-l89oY8H9eRgSszRUlyg0avL67C8h48p8n4-jZotBMCRalcBJLpIueFZkSrnBhGFStEpEnEY8MsChBK2NBIKafyYEGKYSXWghc5iYlUoeB0H3rVrDIHgOKiFBpLViiLnUpVSkw4liUNy8hQ-_QBt7OaqaawuNO3mGY-wAhl5i2ROUtkjSX6cN6Nea_LavzZe9fNfdezmfY-HLfGy5qV95ERbhGQQ6ny8PdRR7BOHIXBp9weQ28xX5oTWFOfi9eP-an_qb4AL2vJNg
linkProvider IEEE
linkToHtml http://cvtisr.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1bS8MwFD7MC-iLd3Fe8-CbVpukaZrHMVonblN0gm-lTVIZaCdz-vtN0nYgiuBDIQ9JKTlNzneS850P4DTDUViYwMfT0qdewFjoZYUMPKaowCbiyrJcObEJPhxGT0_irgXncy6M1toln-kL23R3-WoiP-xR2aWw_p7xBVhiQUBwxdaan6j4lJrdmDfMGF9cjjpJ0jUxIDGhqS2th_E37-PkVH7swc6xJOv_-6QNWKsBJOpUFt-Eli63YL0RZ0D1Wt2G206JYlcfwrwA9R9GAzSsUr6RwakoruR70H2TQGTayXTyihwj19KBS_2C4vgKPYyfbY3lHXhM4lG359XqCZ6kVMw8wVSe8VwxomRGNKOS5VGgSMBDzQwOiGRkgiMprc6DgSmaFVhFPM9ISIT0I053YbGclHoPUJgXkcKC5dKgp0IWAhOORUH9ItDUPG3Azaymsi4tbhUuXlIXYvgidZZIrSXS2hJtOJuPeasKa_zZe9vO_bxnPe1tOGyMl9Zr7z0l3GAgi1PF_u-jTmClNxr00_718OYAVoklNLgE3ENYnE0_9BEsy8_Z-H167H6wL9G6zH0
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=An+Efficient+LSTM+Network+for+Emotion+Recognition+From+Multichannel+EEG+Signals&rft.jtitle=IEEE+transactions+on+affective+computing&rft.au=Du%2C+Xiaobing&rft.au=Ma%2C+Cuixia&rft.au=Zhang%2C+Guanhua&rft.au=Li%2C+Jinyao&rft.date=2022-07-01&rft.issn=1949-3045&rft.eissn=1949-3045&rft.volume=13&rft.issue=3&rft.spage=1528&rft.epage=1540&rft_id=info:doi/10.1109%2FTAFFC.2020.3013711&rft.externalDBID=n%2Fa&rft.externalDocID=10_1109_TAFFC_2020_3013711
thumbnail_l http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/lc.gif&issn=1949-3045&client=summon
thumbnail_m http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/mc.gif&issn=1949-3045&client=summon
thumbnail_s http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/sc.gif&issn=1949-3045&client=summon