An investigation of multimodal EMG-EEG fusion strategies for upper-limb gesture classification
| Published in: | Journal of neural engineering, Volume 22, Issue 4 |
|---|---|
| Main authors: | Pritchard, Michael; Campelo, Felipe; Goldingay, Harry |
| Format: | Journal Article |
| Language: | English |
| Publication details: | England: IOP Publishing, 01.08.2025 |
| Subjects: | automated machine learning; biosignal fusion; brain-computer-interface; multimodal gesture classification |
| ISSN: | 1741-2560 (print); 1741-2552 (online) |
| Online access: | Get full text |
| Abstract | Objective: Upper-limb gesture identification is an important problem in the advancement of robotic prostheses. Prevailing research into classifying electromyographic (EMG) muscular data or electroencephalographic (EEG) brain data for this purpose is often limited in methodological rigour, the extent to which generalisation is demonstrated, and the granularity of gestures classified. This work evaluates three architectures for multimodal fusion of EMG & EEG data in gesture classification, including a novel Hierarchical strategy, in both subject-specific and subject-independent settings.
Approach: We propose an unbiased methodology for designing classifiers centred on Automated Machine Learning through Combined Algorithm Selection & Hyperparameter Optimisation (CASH); the first application of this technique to the biosignal domain. Using CASH, we introduce an end-to-end pipeline for data handling, algorithm development, modelling, and fair comparison, addressing established weaknesses in the biosignal literature.
Main results: EMG-EEG fusion is shown to provide significantly higher subject-independent accuracy in same-hand multi-gesture classification than an equivalent EMG classifier. Our CASH-based design methodology produces a more accurate subject-specific classifier design than that recommended by the literature. Our novel Hierarchical ensemble of classical models outperforms a domain-standard CNN architecture. We achieve a subject-independent EEG multiclass accuracy competitive with many subject-specific approaches used for similar, or more easily separable, problems.
Significance: To our knowledge, this is the first work to establish a systematic framework for the automatic, unbiased design and testing of fusion architectures in the context of multimodal biosignal classification. We demonstrate a robust end-to-end modelling pipeline for biosignal classification problems which, if adopted in future research, can help address the risk of bias common in multimodal BCI studies, enabling more reliable and rigorous comparison of proposed classifiers than is usual in the domain. We apply the approach to a more complex task than is typical of EMG-EEG fusion research, surpassing literature-recommended designs and verifying the efficacy of a novel Hierarchical fusion architecture. |
|---|---|
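The Approach paragraph above centres on CASH (Combined Algorithm Selection and Hyperparameter Optimisation). As a hedged illustration of what a CASH-style search involves, and not a reproduction of the authors' pipeline, the sketch below jointly searches over candidate algorithms and their hyperparameters with scikit-learn; the data, feature dimensions, candidate algorithms, and subject grouping are hypothetical placeholders.

```python
# Minimal sketch of a CASH-style search (assumed setup, not the paper's pipeline):
# one grid search whose space covers both which algorithm to use and its
# hyperparameters, evaluated with subject-grouped folds to approximate a
# subject-independent setting.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import GridSearchCV, GroupKFold

# Hypothetical windowed biosignal features: 600 windows, 32 features,
# 4 gesture classes, 10 subjects (60 windows per subject).
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 32))
y = rng.integers(0, 4, size=600)
groups = np.repeat(np.arange(10), 60)

pipe = Pipeline([("scale", StandardScaler()), ("clf", SVC())])

# Joint algorithm + hyperparameter space: each dict fixes the "clf" step to one
# candidate algorithm and lists the hyperparameters to try for it.
search_space = [
    {"clf": [SVC()], "clf__C": [0.1, 1, 10], "clf__kernel": ["rbf", "linear"]},
    {"clf": [RandomForestClassifier()], "clf__n_estimators": [100, 300]},
    {"clf": [KNeighborsClassifier()], "clf__n_neighbors": [3, 5, 11]},
]

cash = GridSearchCV(pipe, search_space, cv=GroupKFold(n_splits=5), scoring="accuracy")
cash.fit(X, y, groups=groups)  # folds never mix windows from the same subject
print(cash.best_params_, cash.best_score_)
```

Grouping cross-validation folds by subject is one simple way to make the selected design reflect generalisation to unseen subjects rather than within-subject fit.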
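The Hierarchical strategy evaluated in the paper fuses EMG and EEG classifiers into an ensemble. The sketch below shows one generic late-fusion (stacking) arrangement, an assumption on our part rather than the paper's exact architecture: per-modality classifiers produce class probabilities, and a meta-classifier learns to combine them. All data and model choices here are hypothetical.

```python
# Generic late-fusion (stacking) sketch for two biosignal modalities; assumed
# structure only, not the paper's Hierarchical architecture.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)
X_emg = rng.normal(size=(600, 16))   # hypothetical EMG feature windows
X_eeg = rng.normal(size=(600, 64))   # hypothetical EEG feature windows
y = rng.integers(0, 4, size=600)     # four gesture classes

emg_clf = RandomForestClassifier(n_estimators=200, random_state=0)
eeg_clf = RandomForestClassifier(n_estimators=200, random_state=0)

# Out-of-fold class probabilities from each modality, so the fusion stage is
# not trained on predictions the base models made for their own training data.
p_emg = cross_val_predict(emg_clf, X_emg, y, cv=5, method="predict_proba")
p_eeg = cross_val_predict(eeg_clf, X_eeg, y, cv=5, method="predict_proba")

# The fusion level sees only the concatenated per-class probabilities.
fusion_features = np.hstack([p_emg, p_eeg])
fusion_clf = LogisticRegression(max_iter=1000).fit(fusion_features, y)

# For deployment, the base models would be refit on all training data and new
# windows scored by passing their probabilities through fusion_clf.predict().
```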
| Author | Goldingay, Harry; Pritchard, Michael; Campelo, Felipe |
| Author_xml | – sequence: 1; givenname: Michael; surname: Pritchard; fullname: Pritchard, Michael; orcidid: 0000-0002-3783-0230; organization: Aston University, Department of Applied AI and Robotics, B4 7ET Birmingham, United Kingdom
– sequence: 2; givenname: Felipe; surname: Campelo; fullname: Campelo, Felipe; orcidid: 0000-0001-8432-4325; organization: University of Bristol, School of Engineering Mathematics and Technology, BS8 1QU Bristol, United Kingdom
– sequence: 3; givenname: Harry; surname: Goldingay; fullname: Goldingay, Harry; orcidid: 0000-0001-6402-937X; organization: Aston University, Aston Centre for Artificial Intelligence Research and Application, B4 7ET Birmingham, United Kingdom |
| BackLink | https://www.ncbi.nlm.nih.gov/pubmed/40480249 (View this record in MEDLINE/PubMed) |
| CODEN | JNEOBH |
| ContentType | Journal Article |
| Copyright | 2025 The Author(s). Published by IOP Publishing Ltd Creative Commons Attribution license. |
| DOI | 10.1088/1741-2552/ade1f9 |
| DatabaseName | Institute of Physics Open Access Journal Titles; IOPscience (Open Access); Medline; MEDLINE; MEDLINE (Ovid); PubMed; MEDLINE - Academic |
| DatabaseTitle | MEDLINE; Medline Complete; MEDLINE with Full Text; PubMed; MEDLINE (Ovid); MEDLINE - Academic |
| Discipline | Anatomy & Physiology |
| EISSN | 1741-2552 |
| ExternalDocumentID | 40480249 jneade1f9 |
| Genre | Journal Article |
| GrantInformation_xml | – fundername: Engineering and Physical Sciences Research Council; grantid: EP/V036106/1; funderid: http://dx.doi.org/10.13039/501100000266 |
| ISICitedReferencesCount | 0 |
| ISSN | 1741-2560 1741-2552 |
| IsDoiOpenAccess | true |
| IsOpenAccess | true |
| IsPeerReviewed | true |
| IsScholarly | true |
| Issue | 4 |
| Keywords | biosignal fusion; brain-computer-interface; automated machine learning; multimodal gesture classification |
| Language | English |
| License | Original content from this work may be used under the terms of the Creative Commons Attribution 4.0 license. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI. |
| Notes | JNE-108748.R1; ObjectType-Article-1; SourceType-Scholarly Journals-1; ObjectType-Feature-2 |
| ORCID | 0000-0002-3783-0230 0000-0001-8432-4325 0000-0001-6402-937X |
| OpenAccessLink | https://iopscience.iop.org/article/10.1088/1741-2552/ade1f9 |
| PMID | 40480249 |
| PQID | 3216695428 |
| PQPubID | 23479 |
| PageCount | 20 |
| PublicationCentury | 2000 |
| PublicationDate | 2025-08-01 |
| PublicationDecade | 2020 |
| PublicationPlace | England |
| PublicationPlace_xml | – name: England |
| PublicationTitle | Journal of neural engineering |
| PublicationTitleAbbrev | JNE |
| PublicationTitleAlternate | J. Neural Eng |
| PublicationYear | 2025 |
| Publisher | IOP Publishing |
| SubjectTerms | Adult; Algorithms; automated machine learning; biosignal fusion; Brain-Computer Interfaces; brain-computer-interface; Electroencephalography - classification; Electroencephalography - methods; Electromyography - classification; Electromyography - methods; Female; Gestures; Humans; Machine Learning; Male; multimodal gesture classification; Upper Extremity - physiology; Young Adult |
| Title | An investigation of multimodal EMG-EEG fusion strategies for upper-limb gesture classification |
| URI | https://iopscience.iop.org/article/10.1088/1741-2552/ade1f9 https://www.ncbi.nlm.nih.gov/pubmed/40480249 https://www.proquest.com/docview/3216695428 |
| Volume | 22 |