A Muscle Synergy-Inspired Method of Detecting Human Movement Intentions Based on Wearable Sensor Fusion

Published in: IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 29, pp. 1089-1098
Main authors: Liu, Yi-Xing; Wang, Ruoli; Gutierrez-Farewik, Elena M.
Format: Journal Article
Language: English
Published: New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 2021
ISSN: 1534-4320 (print), 1558-0210 (electronic)
Abstract Detecting human movement intentions is fundamental to neural control of robotic exoskeletons, as it is essential for achieving seamless transitions between different locomotion modes. In this study, we enhanced a muscle synergy-inspired method of locomotion mode identification by fusing electromyography data with two types of data from wearable sensors (inertial measurement units), namely linear acceleration and angular velocity. From the finite state machine perspective, the enhanced method was used to systematically identify 2 static modes, 7 dynamic modes, and 27 transitions among them. In addition to the five broadly studied modes (level-ground walking, ramp ascent/descent, stair ascent/descent), we identified transitions between different walking speeds and modes of ramp walking at different inclination angles. Seven combinations of sensor fusion were evaluated on experimental data from 8 able-bodied adult subjects, and their classification accuracy and prediction time were compared. Fusion of electromyography and gyroscope (angular velocity) data predicted transitions earlier and with higher accuracy than the other combinations. All transitions and modes were identified with a total average classification accuracy of 94.5% with fused sensor data. For nearly all transitions, we were able to predict the next locomotion mode 300-500 ms prior to the step into that mode.
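The abstract describes a pipeline of windowed feature extraction from fused EMG and gyroscope signals, classification of the upcoming locomotion mode, and a finite state machine that constrains admissible mode transitions. The sketch below illustrates that general structure only; it is not the authors' muscle synergy-based method, and the mode names, transition table, window lengths, sampling rates, features, and the nearest-centroid classifier are all assumptions made for illustration.

import numpy as np

MODES = ["stand", "walk", "ramp_up", "ramp_down", "stair_up", "stair_down"]

# Allowed transitions (illustrative subset only; the paper identifies 27
# transitions among 2 static and 7 dynamic modes).
ALLOWED = {
    "stand":      {"stand", "walk"},
    "walk":       {"walk", "stand", "ramp_up", "ramp_down", "stair_up", "stair_down"},
    "ramp_up":    {"ramp_up", "walk"},
    "ramp_down":  {"ramp_down", "walk"},
    "stair_up":   {"stair_up", "walk"},
    "stair_down": {"stair_down", "walk"},
}

def window_features(emg, gyro):
    """Time-domain features from one analysis window.
    emg  : (n_samples, n_muscles) surface EMG
    gyro : (n_samples, n_axes)    angular velocity from the IMUs
    """
    mav = np.mean(np.abs(emg), axis=0)                 # mean absolute value per muscle
    wl = np.sum(np.abs(np.diff(emg, axis=0)), axis=0)  # waveform length per muscle
    return np.concatenate([mav, wl, gyro.mean(axis=0), gyro.std(axis=0)])

class NearestCentroidClassifier:
    """Minimal classifier: one feature centroid per locomotion mode."""
    def fit(self, X, y):
        y = np.asarray(y)
        self.labels_ = sorted(set(y))
        self.centroids_ = np.array([X[y == m].mean(axis=0) for m in self.labels_])
        return self

    def predict_one(self, x):
        d = np.linalg.norm(self.centroids_ - x, axis=1)
        return self.labels_[int(np.argmin(d))]

class LocomotionFSM:
    """Accept a predicted mode only if the transition is allowed."""
    def __init__(self, start="stand"):
        self.state = start

    def step(self, predicted):
        if predicted in ALLOWED[self.state]:
            self.state = predicted
        return self.state

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic training windows: 200 ms of 8-channel EMG at 1 kHz and a
    # 3-axis gyroscope at 100 Hz (all rates and channel counts are assumed).
    X, y = [], []
    for i, mode in enumerate(MODES):
        for _ in range(20):
            emg = rng.normal(0.1 * (i + 1), 0.02, size=(200, 8))
            gyro = rng.normal(0.5 * i, 0.1, size=(20, 3))
            X.append(window_features(emg, gyro))
            y.append(mode)
    clf = NearestCentroidClassifier().fit(np.array(X), y)

    fsm = LocomotionFSM(start="stand")
    test = window_features(rng.normal(0.2, 0.02, size=(200, 8)),
                           rng.normal(0.5, 0.1, size=(20, 3)))
    pred = clf.predict_one(test)
    print("predicted mode:", pred, "-> FSM state:", fsm.step(pred))

In the paper, the classifier operates on muscle-synergy-inspired features and is evaluated per transition; the FSM-style gating above only conveys why restricting predictions to allowed transitions helps reject implausible mode switches.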
Author Liu, Yi-Xing
Wang, Ruoli
Gutierrez-Farewik, Elena M.
Author_xml – sequence: 1
  givenname: Yi-Xing
  orcidid: 0000-0002-4679-2934
  surname: Liu
  fullname: Liu, Yi-Xing
  organization: Department of Engineering Mechanics, KTH MoveAbility Lab, KTH Royal Institute of Technology, Stockholm, Sweden
– sequence: 2
  givenname: Ruoli
  orcidid: 0000-0002-2232-5258
  surname: Wang
  fullname: Wang, Ruoli
  organization: Department of Engineering Mechanics, KTH MoveAbility Lab, KTH Royal Institute of Technology, Stockholm, Sweden
– sequence: 3
  givenname: Elena M.
  orcidid: 0000-0001-5417-5939
  surname: Gutierrez-Farewik
  fullname: Gutierrez-Farewik, Elena M.
  email: lanie@kth.se
  organization: Department of Engineering Mechanics, KTH MoveAbility Lab, KTH Royal Institute of Technology, Stockholm, Sweden
BackLink https://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-298904 (view record from Swedish Publication Index, Kungliga Tekniska Högskolan)
http://kipublications.ki.se/Default.aspx?queryparsed=id: (view record from Swedish Publication Index, Karolinska Institutet)
CODEN ITNSB3
CitedBy_id crossref_primary_10_1109_TNSRE_2022_3176410
crossref_primary_10_1088_1741_2552_adfab3
crossref_primary_10_1109_TNSRE_2023_3336360
crossref_primary_10_1007_s41315_024_00334_1
crossref_primary_10_1111_exsy_13659
crossref_primary_10_1109_JSEN_2023_3255255
crossref_primary_10_1007_s10489_022_03823_7
crossref_primary_10_1002_aisy_202300318
crossref_primary_10_1109_TNSRE_2025_3552530
crossref_primary_10_3390_s21227473
crossref_primary_10_1109_JBHI_2024_3441600
crossref_primary_10_1109_JBHI_2024_3462826
crossref_primary_10_3390_s23031643
crossref_primary_10_1097_WNO_0000000000001926
crossref_primary_10_1109_TFUZZ_2022_3158727
crossref_primary_10_1109_JBHI_2024_3497658
crossref_primary_10_1109_TBME_2022_3208381
crossref_primary_10_3390_biomimetics8060471
ContentType Journal Article
Publication
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021
DOI 10.1109/TNSRE.2021.3087135
DatabaseName IEEE All-Society Periodicals Package (ASPP) 2005–Present
IEEE Xplore Open Access Journals
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL)
CrossRef
Aluminium Industry Abstracts
Biotechnology Research Abstracts
Ceramic Abstracts
Computer and Information Systems Abstracts
Corrosion Abstracts
Electronics & Communications Abstracts
Engineered Materials Abstracts
Materials Business File
Mechanical & Transportation Engineering Abstracts
Neurosciences Abstracts
Solid State and Superconductivity Abstracts
METADEX
Technology Research Database
ANTE: Abstracts in New Technology & Engineering
Engineering Research Database
Aerospace Database
Materials Research Database
ProQuest Computer Science Collection
Civil Engineering Abstracts
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts – Academic
Computer and Information Systems Abstracts Professional
Nursing & Allied Health Premium
Biotechnology and BioEngineering Abstracts
MEDLINE - Academic
SwePub
SwePub Articles
SWEPUB Kungliga Tekniska Högskolan
SwePub Other
SWEPUB Freely available online
SwePub Other full text
Discipline Occupational Therapy & Rehabilitation
EISSN 1558-0210
EndPage 1098
ExternalDocumentID oai_swepub_ki_se_659395
oai_DiVA_org_kth_298904
10_1109_TNSRE_2021_3087135
9448148
Genre orig-research
GrantInformation_xml – fundername: Promobilia Foundation
  grantid: 18200
  funderid: 10.13039/100009389
– fundername: Swedish Research Council (ref 2018-00750 BADASS: BiomechAnics in motion Disorders and ASSistance)
  funderid: 10.13039/501100004359
ISICitedReferencesCount 29
ISSN 1534-4320
1558-0210
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Language English
License https://creativecommons.org/licenses/by/4.0/legalcode
ORCID 0000-0001-5417-5939
0000-0002-2232-5258
0000-0002-4679-2934
OpenAccessLink https://ieeexplore.ieee.org/document/9448148
PMID 34097615
PQID 2541465280
PQPubID 85423
PageCount 10
PublicationDate 2021
PublicationPlace New York
PublicationTitle IEEE Transactions on Neural Systems and Rehabilitation Engineering
PublicationTitleAbbrev TNSRE
PublicationYear 2021
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
SourceID swepub
proquest
crossref
ieee
SourceType Open Access Repository
Aggregation Database
Enrichment Source
Index Database
Publisher
StartPage 1089
SubjectTerms Acceleration
Accuracy
Angular velocity
Ascent
Classification
Electromyography
Exoskeleton
Exoskeletons
Feature extraction
Finite state machines
Human motion
Inclination angle
Inertial platforms
Inertial sensing devices
Intent recognition
Legged locomotion
Locomotion
locomotion modes identification
Mechanical sensors
Motion perception
Multisensor fusion
muscle synergies
Muscles
Robot control
robotic exoskeletons
sensor fusion
Sensors
Stairs
Stairways
Velocity
Walking
Wearable technology
Title A Muscle Synergy-Inspired Method of Detecting Human Movement Intentions Based on Wearable Sensor Fusion
URI https://ieeexplore.ieee.org/document/9448148
https://www.proquest.com/docview/2541465280
https://www.proquest.com/docview/2539210177
https://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-298904
http://kipublications.ki.se/Default.aspx?queryparsed=id
Volume 29
WOSCitedRecordID WOS:000663505900005