mmJaw: Remote Jaw Gesture Recognition with COTS mmWave Radar

Bibliographic Details
Published in: Proceedings - International Conference on Parallel and Distributed Systems, pp. 52-59
Main authors: Siddiqi, Awais Ahmad; He, Yuan; Chen, Yande; Sun, Yimao; Wang, Shufan; Xie, Yadong
Format: Conference paper
Language: English
Published: IEEE, 10 October 2024
ISSN: 2690-5965
Online access: Full text
Description
Abstract: With the increasing prevalence of IoT devices and smart systems in daily life, there is a growing demand for new modalities in Human-Computer Interaction (HCI) to improve accessibility, particularly for users who require hands-free and eyes-free interaction in contexts such as VR environments, as well as for individuals with special needs or limited mobility. In this paper, we propose teeth gestures as an input modality for HCI. We find that teeth gestures, such as tapping, clenching, and sliding, are generated by various facial muscle movements that are often imperceptible to the naked eye but can be effectively captured using mm-wave radar. By capturing and analyzing the distinct patterns of these muscle movements, we propose a hands-free and eyes-free HCI solution based on three different gestures. Key challenges addressed in this paper include user range identification amidst background noise and other irrelevant facial movements. Results from 16 volunteers demonstrate the robustness of our approach, achieving 93% accuracy at ranges of up to 2.5 m.
DOI: 10.1109/ICPADS63350.2024.00017
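
The record above only summarizes the approach, so the following is a minimal, hypothetical Python sketch of how a teeth-gesture sensing chain with an FMCW mmWave radar is commonly structured: a range FFT over each chirp, selection of the user's range bin amid background clutter, extraction of the slow-time phase (micro-motion) signal, and a simple gesture decision. All function names, parameter values, and thresholds here are illustrative assumptions and are not taken from mmJaw.

# Hedged illustration (assumption): mmJaw's actual pipeline is not reproduced in
# this record. This is a generic FMCW micro-motion chain using only NumPy:
# range FFT -> user range-bin selection -> slow-time phase extraction -> toy classifier.

import numpy as np

def range_profiles(adc_frames: np.ndarray) -> np.ndarray:
    """Range FFT along fast time; adc_frames has shape (num_chirps, samples_per_chirp)."""
    window = np.hanning(adc_frames.shape[-1])
    return np.fft.fft(adc_frames * window, axis=-1)

def select_user_bin(profiles: np.ndarray, min_bin: int = 5) -> int:
    """Crude stand-in for user range identification: pick the strongest average
    reflection beyond a near-field guard region."""
    energy = np.abs(profiles).mean(axis=0)
    return min_bin + int(np.argmax(energy[min_bin:]))

def micro_motion_signal(profiles: np.ndarray, bin_idx: int) -> np.ndarray:
    """Unwrapped phase of the chosen range bin over slow time; small facial-muscle
    motion shows up as sub-millimeter phase variation."""
    return np.unwrap(np.angle(profiles[:, bin_idx]))

def classify_gesture(phase: np.ndarray) -> str:
    """Toy rule-based decision over simple statistics of the phase signal.
    A real system would use learned features and a trained model; the thresholds
    below are arbitrary placeholders."""
    detrended = phase - phase.mean()
    peak = float(np.abs(detrended).max())
    zero_crossings = int(np.sum(np.diff(np.sign(detrended)) != 0))
    if peak < 0.05:
        return "no_gesture"
    if zero_crossings > 20:
        return "slide"  # sustained oscillatory motion
    return "tap" if peak < 0.5 else "clench"

if __name__ == "__main__":
    # Synthetic complex IF data stands in for radar ADC samples.
    rng = np.random.default_rng(0)
    frames = rng.standard_normal((256, 128)) + 1j * rng.standard_normal((256, 128))
    prof = range_profiles(frames)
    user_bin = select_user_bin(prof)
    print(classify_gesture(micro_motion_signal(prof, user_bin)))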