PGCN: Pyramidal Graph Convolutional Network for EEG Emotion Recognition


Bibliographic Details
Published in: IEEE Transactions on Multimedia, Vol. 26, pp. 9070-9082
Main Authors: Jin, Ming; Du, Changde; He, Huiguang; Cai, Ting; Li, Jinpeng
Format: Journal Article
Language: English
Published: IEEE 2024
Subjects:
ISSN: 1520-9210, 1941-0077
Online Access: Full text
Abstract Emotion recognition is essential in the diagnosis and rehabilitation of various mental diseases. In the last decade, electroencephalogram (EEG)-based emotion recognition has been intensively investigated due to its prominent accuracy and reliability, and the graph convolutional network (GCN) has become a mainstream model to decode emotions from EEG signals. However, the electrode relationship, especially long-range electrode dependencies across the scalp, may be underutilized by GCNs, although such relationships have been proven to be important in emotion recognition. The small receptive field makes shallow GCNs aggregate only local nodes. On the other hand, stacking too many layers leads to over-smoothing. To solve these problems, we propose the pyramidal graph convolutional network (PGCN), which aggregates features at three levels: local, mesoscopic, and global. First, we construct a vanilla GCN based on the 3D topological relationships of electrodes, which is used to integrate two-order local features; second, we construct several mesoscopic brain regions based on prior knowledge and employ mesoscopic attention to sequentially calculate the virtual mesoscopic centers to focus on the functional connections of mesoscopic brain regions; finally, we fuse the node features and their 3D positions to construct a numerical relationship adjacency matrix to integrate structural and functional connections from the global perspective. Experimental results on four public datasets indicate that PGCN enhances relationship modelling across the scalp and achieves state-of-the-art performance in both subject-dependent and subject-independent scenarios. Meanwhile, PGCN makes an effective trade-off between enhancing network depth and receptive fields and suppressing the ensuing over-smoothing.
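The local level described above is essentially a vanilla GCN whose adjacency is derived from the electrodes' 3D topology. The sketch below illustrates only that general idea; the RBF distance kernel, the sigma value, the 62-electrode montage, and every function name are assumptions made for illustration and are not taken from the authors' released implementation.

```python
# Illustrative sketch only: a distance-based adjacency from hypothetical 3D
# electrode positions plus one symmetrically normalized GCN propagation step.
# The kernel choice and hyper-parameters are assumptions, not the PGCN code.
import torch


def distance_adjacency(pos: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
    """Soft adjacency from pairwise 3D electrode distances via an RBF kernel."""
    dist = torch.cdist(pos, pos)                      # (N, N) Euclidean distances
    adj = torch.exp(-dist.pow(2) / (2 * sigma ** 2))  # closer electrodes -> larger weight
    adj.fill_diagonal_(0.0)                           # self-loops added during normalization
    return adj


def normalize_adjacency(adj: torch.Tensor) -> torch.Tensor:
    """Symmetric normalization D^{-1/2} (A + I) D^{-1/2}, as in vanilla GCNs."""
    a_hat = adj + torch.eye(adj.size(0))
    deg_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
    return deg_inv_sqrt.unsqueeze(1) * a_hat * deg_inv_sqrt.unsqueeze(0)


class LocalGCNLayer(torch.nn.Module):
    """One local aggregation step: H' = ReLU(A_hat @ H @ W)."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = torch.nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, a_hat: torch.Tensor) -> torch.Tensor:
        return torch.relu(a_hat @ self.linear(x))


if __name__ == "__main__":
    # 62 electrodes (a SEED-style montage) with 5 band-power features each,
    # both randomly generated here as placeholders.
    pos = torch.randn(62, 3)
    feats = torch.randn(62, 5)
    a_hat = normalize_adjacency(distance_adjacency(pos))
    layer = LocalGCNLayer(in_dim=5, out_dim=32)
    print(layer(feats, a_hat).shape)  # torch.Size([62, 32])
```

A global-level "numerical relationship" adjacency could be sketched in the same spirit by blending feature similarity with these positional distances, but the exact fusion rule is the one defined in the paper itself.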
AbstractList Emotion recognition is essential in the diagnosis and rehabilitation of various mental diseases. In the last decade, electroencephalogram (EEG)-based emotion recognition has been intensively investigated due to its prominent accuracy and reliability, and the graph convolutional network (GCN) has become a mainstream model to decode emotions from EEG signals. However, the electrode relationship, especially long-range electrode dependencies across the scalp, may be underutilized by GCNs, although such relationships have been proven to be important in emotion recognition. The small receptive field makes shallow GCNs aggregate only local nodes. On the other hand, stacking too many layers leads to over-smoothing. To solve these problems, we propose the pyramidal graph convolutional network (PGCN), which aggregates features at three levels: local, mesoscopic, and global. First, we construct a vanilla GCN based on the 3D topological relationships of electrodes, which is used to integrate two-order local features; second, we construct several mesoscopic brain regions based on prior knowledge and employ mesoscopic attention to sequentially calculate the virtual mesoscopic centers to focus on the functional connections of mesoscopic brain regions; finally, we fuse the node features and their 3D positions to construct a numerical relationship adjacency matrix to integrate structural and functional connections from the global perspective. Experimental results on four public datasets indicate that PGCN enhances relationship modelling across the scalp and achieves state-of-the-art performance in both subject-dependent and subject-independent scenarios. Meanwhile, PGCN makes an effective trade-off between enhancing network depth and receptive fields and suppressing the ensuing over-smoothing.
Author He, Huiguang
Cai, Ting
Jin, Ming
Li, Jinpeng
Du, Changde
Author_xml – sequence: 1
  givenname: Ming
  orcidid: 0000-0001-7687-9953
  surname: Jin
  fullname: Jin, Ming
  organization: School of Automation Science and Engineering, South China University of Technology, Guangzhou, China
– sequence: 2
  givenname: Changde
  orcidid: 0000-0002-0084-433X
  surname: Du
  fullname: Du, Changde
  organization: Research Center for Brain-Inspired Intelligence, Institute of Automation, Chinese Academy of Sciences, Beijing, China
– sequence: 3
  givenname: Huiguang
  orcidid: 0000-0002-0684-1711
  surname: He
  fullname: He, Huiguang
  organization: National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, Beijing, China
– sequence: 4
  givenname: Ting
  orcidid: 0000-0001-9649-361X
  surname: Cai
  fullname: Cai, Ting
  organization: No. 2 Hospital, Ningbo, China
– sequence: 5
  givenname: Jinpeng
  orcidid: 0000-0001-8701-2642
  surname: Li
  fullname: Li, Jinpeng
  email: jinpeng.li@ieee.org
  organization: School of Automation Science and Engineering, South China University of Technology, Guangzhou, China
BookMark eNp9kE1PwzAMQCM0JLbBnQOH_IGOJM3Hwg1VpSBtY0LjXGWNA4GumdIC2r-nFTsgDpxs2X6W_SZo1IQGELqkZEYp0deb5XLGCOOzNJ0LqeQJGlPNaUKIUqM-F4wkmlFyhiZt-0YI5YKoMSrWRba6wetDNDtvTY2LaPavOAvNZ6g_Oh-avraC7ivEd-xCxHle4HwXhg5-giq8NH7Iz9GpM3ULF8c4Rc93-Sa7TxaPxUN2u0gqJnmXgEgZq6jTThCwXHErttxsjTNAU7tVzFnptHZzagUoxpWRslIOjOm_slykUyR_9lYxtG0EV1a-M8MFXTS-LikpBx1lr6McdJRHHT1I_oD76HcmHv5Drn4QDwC_xrmWVNP0G6f-bXU
CODEN ITMUF8
CitedBy_id crossref_primary_10_1016_j_bspc_2025_108231
crossref_primary_10_1007_s00521_024_10821_y
crossref_primary_10_1038_s41598_024_82705_z
crossref_primary_10_1016_j_bspc_2025_108556
crossref_primary_10_1109_JBHI_2024_3504847
crossref_primary_10_1109_TAFFC_2025_3527459
crossref_primary_10_1016_j_eswa_2025_128183
crossref_primary_10_1016_j_knosys_2025_113752
crossref_primary_10_1109_TIFS_2025_3602266
crossref_primary_10_1007_s00180_025_01666_7
crossref_primary_10_1016_j_neucom_2025_130254
crossref_primary_10_1007_s00371_024_03652_4
crossref_primary_10_1016_j_aej_2025_09_013
crossref_primary_10_1016_j_cmpb_2025_109021
crossref_primary_10_1016_j_bspc_2025_107799
crossref_primary_10_1007_s00530_025_01894_3
crossref_primary_10_1016_j_knosys_2025_113938
crossref_primary_10_1109_TNSRE_2025_3603190
crossref_primary_10_1007_s10586_024_04994_3
crossref_primary_10_3389_fnins_2024_1479570
crossref_primary_10_1016_j_knosys_2025_114115
crossref_primary_10_1109_TIM_2025_3553234
crossref_primary_10_1109_TAFFC_2025_3564272
crossref_primary_10_1016_j_eswa_2025_127323
crossref_primary_10_1186_s40708_024_00242_x
crossref_primary_10_1016_j_neunet_2025_107457
crossref_primary_10_1016_j_eswa_2025_128035
Cites_doi 10.1002/da.22728
10.1093/cercor/bhn102
10.1109/TAFFC.2017.2712143
10.1016/j.neucom.2017.08.043
10.1109/EMBC46164.2021.9630195
10.1016/j.neubiorev.2017.04.021
10.1016/j.nicl.2020.102331
10.1109/79.911197
10.1109/TNSRE.2020.2980223
10.7551/mitpress/9609.001.0001
10.1109/NER.2013.6695876
10.1109/TNN.2010.2091281
10.1109/TAFFC.2018.2885474
10.1016/j.neuroimage.2003.09.055
10.1109/TAFFC.2021.3064940
10.1109/TAFFC.2020.2994159
10.5555/2946645.2946704
10.1109/TCDS.2019.2963476
10.3389/fpsyt.2020.00698
10.1109/TCYB.2018.2797176
10.1023/A:1018628609742
10.1109/TPAMI.2021.3074057
10.1109/TCDS.2017.2685338
10.1109/TAFFC.2019.2937768
10.1109/TAFFC.2018.2817622
10.1145/3326362
10.1371/journal.pcbi.0010042
10.1109/TCDS.2020.2999337
10.1088/1741-2552/aace8c
10.1109/ijcnn48605.2020.9206750
10.1109/tnnls.2023.3236635
10.1016/j.neunet.2019.04.003
10.1093/cercor/bhi016
10.1109/NER.2019.8717055
10.1109/TAMD.2015.2431497
10.1109/TCYB.2019.2905157
10.1371/journal.pbio.0060159
10.1109/ICASSP.2018.8462440
10.14569/ijacsa.2017.081046
10.1109/TBME.2010.2048568
10.1145/3474085.3475697
10.1109/TSMCA.2008.918624
10.1093/cercor/bhl149
10.1109/ACCESS.2019.2891579
10.1109/CVPR.2016.90
10.1609/aaai.v34i04.5747
10.1109/TAFFC.2018.2874986
10.1016/j.tins.2004.02.007
10.24963/ijcai.2017/250
10.1109/TITS.2019.2935152
10.1109/TCDS.2021.3071170
10.1038/nrn3214
10.1109/TAFFC.2020.3025777
ContentType Journal Article
DBID 97E
RIA
RIE
AAYXX
CITATION
DOI 10.1109/TMM.2024.3385676
DatabaseName IEEE Xplore (IEEE)
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL)
CrossRef
DatabaseTitle CrossRef
DatabaseTitleList
Database_xml – sequence: 1
  dbid: RIE
  name: IEEE Electronic Library (IEL)
  url: https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
Discipline Engineering
Computer Science
EISSN 1941-0077
EndPage 9082
ExternalDocumentID 10_1109_TMM_2024_3385676
10496191
Genre orig-research
GrantInformation_xml – fundername: Guangdong Provincial Key Laboratory of Human Digital Twin
  grantid: 2022B1212010004
– fundername: Ningbo Clinical Research Center for Medical Imaging
  grantid: 2021L003
– fundername: National Natural Science Foundation of China
  grantid: 62106248
  funderid: 10.13039/501100001809
GroupedDBID -~X
0R~
29I
4.4
5GY
5VS
6IK
97E
AAJGR
AARMG
AASAJ
AAWTH
ABAZT
ABQJQ
ABVLG
ACGFO
ACGFS
ACIWK
AENEX
AETIX
AGQYO
AGSQL
AHBIQ
AI.
AIBXA
AKJIK
AKQYR
ALLEH
ALMA_UNASSIGNED_HOLDINGS
ATWAV
BEFXN
BFFAM
BGNUA
BKEBE
BPEOZ
CS3
DU5
EBS
EJD
HZ~
H~9
IFIPE
IFJZH
IPLJI
JAVBF
LAI
M43
O9-
OCL
P2P
PQQKQ
RIA
RIE
RNS
TN5
VH1
ZY4
AAYXX
CITATION
ID FETCH-LOGICAL-c264t-e5322c1f9f50ed474d5b4abafae13db72fd6f99f81d5e7247a66c7feaa385d453
IEDL.DBID RIE
ISICitedReferencesCount 37
ISICitedReferencesURI http://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=Summon&SrcAuth=ProQuest&DestLinkType=CitingArticles&DestApp=WOS_CPL&KeyUT=001297535300026&url=https%3A%2F%2Fcvtisr.summon.serialssolutions.com%2F%23%21%2Fsearch%3Fho%3Df%26include.ft.matches%3Dt%26l%3Dnull%26q%3D
ISSN 1520-9210
IngestDate Sat Nov 29 03:10:15 EST 2025
Tue Nov 18 22:26:24 EST 2025
Wed Aug 27 01:54:45 EDT 2025
IsPeerReviewed true
IsScholarly true
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
LinkModel DirectLink
MergedId FETCHMERGED-LOGICAL-c264t-e5322c1f9f50ed474d5b4abafae13db72fd6f99f81d5e7247a66c7feaa385d453
ORCID 0000-0001-9649-361X
0000-0001-7687-9953
0000-0002-0084-433X
0000-0001-8701-2642
0000-0002-0684-1711
PageCount 13
ParticipantIDs ieee_primary_10496191
crossref_citationtrail_10_1109_TMM_2024_3385676
crossref_primary_10_1109_TMM_2024_3385676
PublicationCentury 2000
PublicationDate 20240000
2024-00-00
PublicationDateYYYYMMDD 2024-01-01
PublicationDate_xml – year: 2024
  text: 20240000
PublicationDecade 2020
PublicationTitle IEEE transactions on multimedia
PublicationTitleAbbrev TMM
PublicationYear 2024
Publisher IEEE
Publisher_xml – name: IEEE
References ref13
ref57
ref12
ref56
ref15
ref14
ref53
ref52
ref11
ref55
ref10
ref54
ref17
ref16
ref19
ref18
Velikovi (ref44) 2017
Defferrard (ref36) 2016
ref51
ref50
ref46
ref45
ref48
ref47
ref41
ref43
ref49
ref8
ref7
ref9
ref4
ref3
ref6
ref5
ref40
ref35
ref34
Jawabri (ref42) 2023
ref37
ref31
ref30
Kipf (ref33) 2016
ref32
ref2
ref1
ref39
ref38
ref24
ref23
ref26
ref25
ref20
ref22
ref21
ref28
ref27
ref29
References_xml – ident: ref3
  doi: 10.1002/da.22728
– ident: ref18
  doi: 10.1093/cercor/bhn102
– start-page: 3844
  volume-title: Proc. Annu. Conf. Neural Inf. Process. Syst.
  year: 2016
  ident: ref36
  article-title: Convolutional neural networks on graphs with fast localized spectral filtering
– ident: ref43
  doi: 10.1109/TAFFC.2017.2712143
– ident: ref8
  doi: 10.1016/j.neucom.2017.08.043
– ident: ref38
  doi: 10.1109/EMBC46164.2021.9630195
– ident: ref41
  doi: 10.1016/j.neubiorev.2017.04.021
– ident: ref2
  doi: 10.1016/j.nicl.2020.102331
– ident: ref1
  doi: 10.1109/79.911197
– ident: ref11
  doi: 10.1109/TNSRE.2020.2980223
– ident: ref23
  doi: 10.7551/mitpress/9609.001.0001
– ident: ref25
  doi: 10.1109/NER.2013.6695876
– ident: ref54
  doi: 10.1109/TNN.2010.2091281
– ident: ref49
  doi: 10.1109/TAFFC.2018.2885474
– ident: ref24
  doi: 10.1016/j.neuroimage.2003.09.055
– ident: ref14
  doi: 10.1109/TAFFC.2021.3064940
– ident: ref13
  doi: 10.1109/TAFFC.2020.2994159
– ident: ref55
  doi: 10.5555/2946645.2946704
– ident: ref4
  doi: 10.1109/TCDS.2019.2963476
– ident: ref9
  doi: 10.3389/fpsyt.2020.00698
– ident: ref46
  doi: 10.1109/TCYB.2018.2797176
– ident: ref26
  doi: 10.1023/A:1018628609742
– ident: ref39
  doi: 10.1109/TPAMI.2021.3074057
– ident: ref29
  doi: 10.1109/TCDS.2017.2685338
– ident: ref12
  doi: 10.1109/TAFFC.2019.2937768
– ident: ref10
  doi: 10.1109/TAFFC.2018.2817622
– ident: ref45
  doi: 10.1145/3326362
– ident: ref17
  doi: 10.1371/journal.pcbi.0010042
– ident: ref51
  doi: 10.1109/TCDS.2020.2999337
– ident: ref22
  doi: 10.1088/1741-2552/aace8c
– ident: ref21
  doi: 10.1109/ijcnn48605.2020.9206750
– ident: ref37
  doi: 10.1109/tnnls.2023.3236635
– ident: ref27
  doi: 10.1016/j.neunet.2019.04.003
– ident: ref40
  doi: 10.1093/cercor/bhi016
– ident: ref52
  doi: 10.1109/NER.2019.8717055
– ident: ref28
  doi: 10.1109/TAMD.2015.2431497
– ident: ref32
  doi: 10.1109/TCYB.2019.2905157
– volume-title: StatPearls
  year: 2023
  ident: ref42
  article-title: Physiology, cerebral cortex functions
– ident: ref16
  doi: 10.1371/journal.pbio.0060159
– ident: ref6
  doi: 10.1109/ICASSP.2018.8462440
– ident: ref30
  doi: 10.14569/ijacsa.2017.081046
– ident: ref50
  doi: 10.1109/TBME.2010.2048568
– ident: ref53
  doi: 10.1145/3474085.3475697
– ident: ref5
  doi: 10.1109/TSMCA.2008.918624
– ident: ref15
  doi: 10.1093/cercor/bhl149
– ident: ref48
  doi: 10.1109/ACCESS.2019.2891579
– year: 2017
  ident: ref44
  article-title: Graph attention networks
– ident: ref56
  doi: 10.1109/CVPR.2016.90
– ident: ref57
  doi: 10.1609/aaai.v34i04.5747
– ident: ref7
  doi: 10.1109/TAFFC.2018.2874986
– ident: ref20
  doi: 10.1016/j.tins.2004.02.007
– ident: ref34
  doi: 10.24963/ijcai.2017/250
– ident: ref35
  doi: 10.1109/TITS.2019.2935152
– ident: ref47
  doi: 10.1109/TCDS.2021.3071170
– ident: ref19
  doi: 10.1038/nrn3214
– year: 2016
  ident: ref33
  article-title: Semi-supervised classification with graph convolutional networks
– ident: ref31
  doi: 10.1109/TAFFC.2020.3025777
SSID ssj0014507
Score 2.5586882
Snippet Emotion recognition is essential in the diagnosis and rehabilitation of various mental diseases. In the last decade, electroencephalogram (EEG)-based emotion...
SourceID crossref
ieee
SourceType Enrichment Source
Index Database
Publisher
StartPage 9070
SubjectTerms Convolutional neural networks
Electrodes
Electroencephalogram
Electroencephalography
Emotion recognition
Feature extraction
graph convolutional network
Knowledge engineering
knowledge-based modelling
Scalp
Title PGCN: Pyramidal Graph Convolutional Network for EEG Emotion Recognition
URI https://ieeexplore.ieee.org/document/10496191
Volume 26
WOSCitedRecordID wos001297535300026&url=https%3A%2F%2Fcvtisr.summon.serialssolutions.com%2F%23%21%2Fsearch%3Fho%3Df%26include.ft.matches%3Dt%26l%3Dnull%26q%3D
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
journalDatabaseRights – providerCode: PRVIEE
  databaseName: IEEE Electronic Library (IEL)
  customDbUrl:
  eissn: 1941-0077
  dateEnd: 99991231
  omitProxy: false
  ssIdentifier: ssj0014507
  issn: 1520-9210
  databaseCode: RIE
  dateStart: 19990101
  isFulltext: true
  titleUrlDefault: https://ieeexplore.ieee.org/
  providerName: IEEE
link http://cvtisr.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV3NS8MwFA86POjB6Zw4v8jBi4duaZs0qzcZWz24MmTKbiVNExhoK3Mb-N_7krajFwVvJeRBeS8f75f33u8hdEcNUzZgaYf5XDjUJb6T-po5rqXfT4eK2tYJb888joeLRTiritVtLYxSyiafqb75tLH8rJAb81QGO5yG4PAD2NnnPCiLtXYhA8psbTTcR8QJAcjUMUkSDubTKSBBj_YBj7HA0Is07qBGUxV7p0za__ybE3RcOY_4sbT2KdpTeQe168YMuNqnHXTUYBk8Q9EsGsUPePa9Eh_LDOQjQ1KNR0W-rdYdjMVlOjgGHxaPxxEel-198EudYFTkXfQ6Gc9HT07VP8GR4OasHcVgt0pXh5oRlVFOM5ZSkQotlGtolT2dBToMNbisTHGPchEEkmslBKgpo8w_R628yNUFwjIg3CNCu4EAwKKkkNolqZv5RHgZnAE9NKg1msiKXNz0uHhPLMggYQI2SIwNksoGPXS_k_gsiTX-mNs16m_MKzV_-cv4FTo04uVLyTVqrVcbdYMO5Ha9_Frd2mXzAzycvr0
linkProvider IEEE
linkToHtml http://cvtisr.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV3PS8MwFA4yBfXgdE6cP3Pw4qFb2ibt6k1G14lbGTJlt5KmCQy0lbkN_O99Sbuxi4K3EpJQ3svLe1-S9z2E7qhmygYsbTHX5xa1iWulrmKWbej3066kpnTC29CP4-50GoyrZHWTCyOlNI_PZFt_mrv8rBBLfVQGFk4DCPgB7OwymIOU6VqbSwPKTHY0eCRiBQBl1reSJOhMRiPAgg5tAyJjniYY2fJCW2VVjFfp1__5P8foqAof8WOp7xO0I_MGqq9LM-DKUhvocItn8BRF46gXP-Dx95x_zDIYH2maatwr8lW18qAtLh-EY4hicRhGOCwL_OCX9ROjIm-i13446Q2sqoKCJSDQWViSgb0KWwWKEZlRn2YspTzliktbEys7KvNUECgIWpn0HepzzxO-kpyDmDLK3DNUy4tcniMsPOI7hCvb4wBZpOBC2SS1M5dwJ4NdoIU6a4kmoqIX11Uu3hMDM0iQgA4SrYOk0kEL3W9GfJbUGn_0bWrxb_UrJX_xS_st2h9MRsNk-BQ_X6IDPVV5bnKFaov5Ul6jPbFazL7mN2YJ_QBVgcIE
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=PGCN%3A+Pyramidal+Graph+Convolutional+Network+for+EEG+Emotion+Recognition&rft.jtitle=IEEE+transactions+on+multimedia&rft.au=Jin%2C+Ming&rft.au=Du%2C+Changde&rft.au=He%2C+Huiguang&rft.au=Cai%2C+Ting&rft.date=2024&rft.issn=1520-9210&rft.eissn=1941-0077&rft.volume=26&rft.spage=9070&rft.epage=9082&rft_id=info:doi/10.1109%2FTMM.2024.3385676&rft.externalDBID=n%2Fa&rft.externalDocID=10_1109_TMM_2024_3385676
thumbnail_l http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/lc.gif&issn=1520-9210&client=summon
thumbnail_m http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/mc.gif&issn=1520-9210&client=summon
thumbnail_s http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/sc.gif&issn=1520-9210&client=summon