Emotion Recognition Empowered Human-Computer Interaction With Domain Adaptation Network
Multi-modal emotion recognition plays a vital role in human-computer interaction (HCI) for consumer electronics. Many studies have developed multi-modal fusion algorithms for this purpose. However, two challenging issues remain unsolved: inefficient multi-modal feature fusion and unclear distance in feature space.
| Published in: | IEEE Transactions on Consumer Electronics, Vol. 71, No. 2, pp. 6777-6786 |
|---|---|
| Main Authors: | Xu Xu, Chong Fu, Junxin Chen |
| Format: | Journal Article |
| Language: | English |
| Published: | New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.05.2025 |
| Subjects: | Adaptation; Adaptation models; Brain modeling; Complexity theory; Consumer electronics; Data models; deep learning; domain adaptation; Electroencephalography; Emotion recognition; Emotions; Eye movements; Feature extraction; Human-computer interaction; Human-computer interface; low-rank fusion; multi-modal; Physiology; Robustness; Training |
| ISSN: | 0098-3063, 1558-4127 |
| Online Access: | https://ieeexplore.ieee.org/document/10819001 |
| Abstract | Multi-modal emotion recognition plays a vital role in human-computer interaction (HCI) for consumer electronics. Many studies have developed multi-modal fusion algorithms for this purpose. However, two challenging issues remain unsolved: inefficient multi-modal feature fusion and unclear distance in feature space. To this end, we develop a novel framework, namely LAFDA-Net, for cross-subject emotion recognition using EEG and eye movement signals. It is based on low-rank fusion and a domain adaptation network. More specifically, the multi-modal signals are input into the feature extraction branch in parallel to generate features. Then, these features are fused by the low-rank fusion branch, which reduces complexity and avoids overfitting. Next, the fused features are flattened and sent to the classification branch to determine the emotion status. During training, these features are also input into the domain adaptation branch to bridge the gap between the source domain and the target domain. Three benchmark datasets, i.e., SEED, SEED-IV, and SEED-V, are employed for performance validation. Extensive results demonstrate that the proposed LAFDA-Net is robust, effective, and has advantages over peer methods. (An illustrative sketch of this pipeline appears at the end of this record.) |
|---|---|
| Author | Xu, Xu; Chen, Junxin; Fu, Chong |
| Author_xml | – sequence: 1; fullname: Xu, Xu; orcidid: 0000-0002-3934-9096; organization: School of Computer Science and Engineering, Northeastern University, Shenyang, China – sequence: 2; fullname: Fu, Chong; orcidid: 0000-0002-4549-744X; email: fuchong@mail.neu.edu.cn; organization: School of Computer Science and Engineering, Northeastern University, Shenyang, China – sequence: 3; fullname: Chen, Junxin; orcidid: 0000-0003-4745-8361; organization: School of Software, Dalian University of Technology, Dalian, China |
| CODEN | ITCEDA |
| ContentType | Journal Article |
| Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2025 |
| DOI | 10.1109/TCE.2024.3524401 |
| Discipline | Engineering |
| EISSN | 1558-4127 |
| EndPage | 6786 |
| Genre | orig-research |
| GrantInformation_xml | – fundername: Fundamental Research Funds for the Central Universities; grantid: N2424010-18, DUT22RC(3)099; funderid: 10.13039/501100012226 – fundername: Liaoning Provincial Science and Technology Plan Project; grantid: 2023JH2/101700370 – fundername: Xiaomi Young Talents Program – fundername: National Natural Science Foundation of China; grantid: 62171114; funderid: 10.13039/501100001809 |
| ISSN | 0098-3063 |
| IsPeerReviewed | true |
| IsScholarly | true |
| Issue | 2 |
| Language | English |
| License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
| ORCID | 0000-0003-4745-8361 0000-0002-4549-744X 0000-0002-3934-9096 |
| PageCount | 10 |
| PublicationDate | 2025-05-01 |
| PublicationPlace | New York |
| PublicationTitle | IEEE Transactions on Consumer Electronics |
| PublicationTitleAbbrev | T-CE |
| PublicationYear | 2025 |
| Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| StartPage | 6777 |
| SubjectTerms | Adaptation; Adaptation models; Brain modeling; Complexity theory; Consumer electronics; Data models; deep learning; domain adaptation; Electroencephalography; Emotion recognition; Emotions; Eye movements; Feature extraction; Human-computer interaction; Human-computer interface; low-rank fusion; multi-modal; Physiology; Robustness; Training |
| Title | Emotion Recognition Empowered Human-Computer Interaction With Domain Adaptation Network |
| URI | https://ieeexplore.ieee.org/document/10819001 https://www.proquest.com/docview/3247359654 |
| Volume | 71 |
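A minimal sketch of the pipeline described in the abstract: per-modality feature extraction, low-rank fusion of EEG and eye-movement features, a classification head, and a domain-adaptation head. This is an illustration only, not the authors' LAFDA-Net code; the layer sizes, the feature dimensions (310 EEG and 33 eye-movement features are typical of the SEED datasets but assumed here), the fusion rank, and the use of a gradient reversal layer for adversarial domain alignment are all assumptions.

```python
# Illustrative sketch (not the authors' implementation) of low-rank multi-modal
# fusion with an adversarial domain-adaptation branch, written in PyTorch.
import torch
import torch.nn as nn
from torch.autograd import Function


class GradReverse(Function):
    """Gradient reversal: identity in the forward pass, negated (scaled) gradient
    in the backward pass, so the encoder is pushed toward domain-invariant features."""
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


class LowRankFusion(nn.Module):
    """Fuse two modality features with rank-r factor matrices instead of a full
    bilinear tensor, keeping the parameter count small (one stated motivation
    for low-rank fusion in the abstract)."""
    def __init__(self, dim_a, dim_b, dim_out, rank=4):
        super().__init__()
        self.w_a = nn.Parameter(0.01 * torch.randn(rank, dim_a + 1, dim_out))
        self.w_b = nn.Parameter(0.01 * torch.randn(rank, dim_b + 1, dim_out))

    def forward(self, f_a, f_b):
        ones = f_a.new_ones(f_a.size(0), 1)
        f_a = torch.cat([f_a, ones], dim=1)                  # (B, dim_a + 1)
        f_b = torch.cat([f_b, ones], dim=1)                  # (B, dim_b + 1)
        z_a = torch.einsum("bi,rio->rbo", f_a, self.w_a)     # (rank, B, dim_out)
        z_b = torch.einsum("bj,rjo->rbo", f_b, self.w_b)
        return (z_a * z_b).sum(dim=0)                        # (B, dim_out)


class EmotionNet(nn.Module):
    def __init__(self, dim_eeg=310, dim_eye=33, dim_fused=64, n_classes=3):
        super().__init__()
        self.enc_eeg = nn.Sequential(nn.Linear(dim_eeg, 128), nn.ReLU())
        self.enc_eye = nn.Sequential(nn.Linear(dim_eye, 128), nn.ReLU())
        self.fusion = LowRankFusion(128, 128, dim_fused)
        self.classifier = nn.Linear(dim_fused, n_classes)    # emotion label
        self.domain_head = nn.Linear(dim_fused, 2)           # source vs. target domain

    def forward(self, eeg, eye, lambd=1.0):
        fused = self.fusion(self.enc_eeg(eeg), self.enc_eye(eye))
        emotion_logits = self.classifier(fused)
        domain_logits = self.domain_head(GradReverse.apply(fused, lambd))
        return emotion_logits, domain_logits
```

In such a setup, a cross-entropy loss on `emotion_logits` for labelled source-domain data would be combined with a cross-entropy loss on `domain_logits` for both source and unlabelled target data; the gradient reversal then encourages the fused features to become domain-invariant, which is one common way to bridge the gap between source and target subjects in cross-subject emotion recognition.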