Plug-and-Play Gesture Control Using Muscle and Motion Sensors
| Published in: | 2020 15th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 439-448 |
|---|---|
| Main authors: | DelPreto, Joseph; Rus, Daniela |
| Format: | Conference paper |
| Language: | English |
| Publication details: | New York, NY, USA: ACM, 09.03.2020 |
| Series: | ACM Conferences |
| ISBN: | 1450367461, 9781450367462 |
| ISSN: | 2167-2148 |
| Online access: | Get full text |
| Abstract | As the capacity for machines to extend human capabilities continues to grow, the communication channels used must also expand. Allowing machines to interpret nonverbal commands such as gestures can help make interactions more similar to interactions with another person. Yet to be pervasive and effective in realistic scenarios, such interfaces should not require significant sensing infrastructure or per-user setup time. The presented work takes a step towards these goals by using wearable muscle and motion sensors to detect gestures without dedicated calibration or training procedures. An algorithm is presented for clustering unlabeled streaming data in real time, and it is applied to adaptively thresholding muscle and motion signals acquired via electromyography (EMG) and an inertial measurement unit (IMU). This enables plug-and-play online detection of arm stiffening, fist clenching, rotation gestures, and forearm activation. It also augments a neural network pipeline, trained only on strategically chosen training data from previous users, to detect left, right, up, and down gestures. Together, these pipelines offer a plug-and-play gesture vocabulary suitable for remotely controlling a robot. Experiments with 6 subjects evaluate classifier performance and interface efficacy. Classifiers correctly identified 97.6% of 1,200 cued gestures, and a drone correctly responded to 81.6% of 1,535 unstructured gestures as subjects remotely controlled it through target hoops during 119 minutes of total flight time. |
|---|---|
| CCS Concepts | Human-centered computing → Human computer interaction (HCI); Gestural input; Computer systems organization → Robotics; Computing methodologies → Machine learning |
| ACM Reference Format | Joseph DelPreto and Daniela Rus. 2020. Plug-and-Play Gesture Control Using Muscle and Motion Sensors. In Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction (HRI '20), March 23-26, 2020, Cambridge, United Kingdom. ACM, New York, NY, USA, 10 pages. https://doi.org/10.1145/3319502.3374823 |
| Author | DelPreto, Joseph; Rus, Daniela |
| Author_xml | DelPreto, Joseph (delpreto@csail.mit.edu), Massachusetts Institute of Technology, Cambridge, MA, USA; Rus, Daniela (rus@csail.mit.edu), Massachusetts Institute of Technology, Cambridge, MA, USA |
| ContentType | Conference Proceeding |
| Copyright | 2020 Owner/Author |
| DOI | 10.1145/3319502.3374823 |
| Discipline | Engineering |
| EISBN | 1450367461 9781450367462 |
| EISSN | 2167-2148 |
| EndPage | 448 |
| ExternalDocumentID | 9484228 |
| Genre | orig-research |
| GrantInformation_xml | – fundername: Boeing funderid: 10.13039/100000003 |
| ISBN | 1450367461 9781450367462 |
| ISICitedReferencesCount | 35 |
| IsDoiOpenAccess | true |
| IsOpenAccess | true |
| IsPeerReviewed | false |
| IsScholarly | false |
| Keywords | IMU; human-robot interaction; plug-and-play; robotics; teleoperation; wearable sensors; machine learning; EMG; gestures |
| Language | English |
| License | This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike International 4.0 License. |
| MeetingName | HRI '20: ACM/IEEE International Conference on Human-Robot Interaction |
| OpenAccessLink | https://dl.acm.org/doi/pdf/10.1145/3319502.3374823 |
| PageCount | 10 |
| PublicationDate | 2020-03-09 |
| PublicationPlace | New York, NY, USA |
| PublicationSeriesTitle | ACM Conferences |
| PublicationTitle | 2020 15th ACM/IEEE International Conference on Human-Robot Interaction (HRI) |
| PublicationTitleAbbrev | HRI |
| PublicationYear | 2020 |
| Publisher | ACM |
| StartPage | 439 |
| SubjectTerms | Computer systems organization -- Embedded and cyber-physical systems -- Robotics; Computing methodologies -- Machine learning; EMG; Gestures; Human computer interaction; Human-centered computing -- Human computer interaction (HCI); Human-centered computing -- Human computer interaction (HCI) -- Interaction techniques -- Gestural input; Human-robot interaction; IMU; Machine Learning; Muscles; Neural networks; Pipelines; Plug-and-Play; Robotics; Training data; Vocabulary |
| Title | Plug-and-Play Gesture Control Using Muscle and Motion Sensors |
| URI | https://ieeexplore.ieee.org/document/9484228 |
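The abstract describes detecting gestures by clustering unlabeled streaming muscle and motion data in real time and using the clusters to adaptively threshold EMG and IMU signals. This record does not include the authors' algorithm; the sketch below is only a minimal illustration of the general idea, using a synthetic one-dimensional "EMG envelope" stream and invented names (StreamingTwoClusterThreshold, init_low, init_high, lr), not the implementation from the paper.

```python
# Minimal sketch (not the authors' algorithm): adaptively threshold a
# streaming 1-D signal by maintaining two online cluster centroids, one
# for "rest" activity and one for "active" bursts.  The signal, window
# sizes, and learning rate below are illustrative assumptions.
import numpy as np


class StreamingTwoClusterThreshold:
    """Online two-cluster model of an unlabeled 1-D stream.

    Each incoming sample is assigned to the nearer of two centroids
    (rest vs. active); the winning centroid is nudged toward the sample,
    and the detection threshold is kept midway between the centroids.
    """

    def __init__(self, init_low=0.1, init_high=0.9, lr=0.05):
        self.low = init_low    # centroid for the "rest" cluster
        self.high = init_high  # centroid for the "active" cluster
        self.lr = lr           # per-sample learning rate

    def update(self, x):
        # Assign the sample to the nearer centroid and move that centroid.
        if abs(x - self.low) <= abs(x - self.high):
            self.low += self.lr * (x - self.low)
        else:
            self.high += self.lr * (x - self.high)
        return self.threshold()

    def threshold(self):
        return 0.5 * (self.low + self.high)

    def is_active(self, x):
        return x > self.threshold()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic "EMG envelope": mostly rest with occasional bursts.
    rest = rng.normal(0.15, 0.03, size=400)
    burst = rng.normal(0.80, 0.08, size=100)
    stream = np.concatenate([rest[:200], burst[:50], rest[200:], burst[50:]])

    clf = StreamingTwoClusterThreshold()
    detections = []
    for sample in stream:
        clf.update(sample)
        detections.append(clf.is_active(sample))

    print(f"adaptive threshold settled near {clf.threshold():.2f}")
    print(f"fraction of samples flagged active: {np.mean(detections):.2%}")
```

Keeping the decision boundary midway between the "rest" and "active" centroids lets the threshold drift with electrode placement and per-user signal levels without a dedicated calibration step, which is in the spirit of the no-calibration, plug-and-play goal the abstract describes.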

