mmJaw: Remote Jaw Gesture Recognition with COTS mmWave Radar

Bibliographic Details
Published in: Proceedings - International Conference on Parallel and Distributed Systems, pp. 52-59
Main Authors: Siddiqi, Awais Ahmad; He, Yuan; Chen, Yande; Sun, Yimao; Wang, Shufan; Xie, Yadong
Format: Conference Proceeding
Language: English
Published: IEEE, 10.10.2024
ISSN: 2690-5965
Description
Summary: With the increasing prevalence of IoT devices and smart systems in daily life, there is a growing demand for new modalities in Human-Computer Interaction (HCI) to improve accessibility, particularly for users who require hands-free and eyes-free interaction in contexts like VR environments, as well as for individuals with special needs or limited mobility. In this paper, we propose teeth gestures as an input modality for HCI. We find that teeth gestures, such as tapping, clenching, and sliding, are generated by various facial muscle movements that are often imperceptible to the naked eye but can be effectively captured using mm-wave radar. By capturing and analyzing the distinct patterns of these muscle movements, we propose a hands-free and eyes-free HCI solution based on three different gestures. Key challenges addressed in this paper include user range identification amidst background noise and other irrelevant facial movements. Results from 16 volunteers demonstrate the robustness of our approach, achieving 93% accuracy at ranges of up to 2.5 m.
DOI: 10.1109/ICPADS63350.2024.00017
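
Note: The summary mentions user range identification amidst background noise as a key challenge. The following is an illustrative, minimal sketch of how a range-FFT pipeline on FMCW mmWave radar data could estimate the user's range while suppressing static clutter; it is not the authors' implementation, and all radar parameters (chirp slope, ADC sample rate, samples per chirp) and the synthetic demo data are assumptions chosen only for demonstration.

# Illustrative sketch (assumed parameters, not the paper's code): estimate the
# range of a moving reflector (e.g. the user's jaw/face) from FMCW radar IF
# samples via a range FFT, after subtracting static background clutter.
import numpy as np

C = 3e8             # speed of light (m/s)
SLOPE = 60e12       # assumed chirp slope (Hz/s)
FS = 5e6            # assumed ADC sample rate (Hz)
N_SAMPLES = 256     # assumed samples per chirp

def estimate_range(chirp_frames: np.ndarray) -> float:
    """Return the dominant moving-target range (m) from IF chirps.

    chirp_frames: complex array of shape (n_chirps, N_SAMPLES).
    Subtracting the mean across chirps suppresses static background, so only
    moving reflectors contribute to the detected range-FFT peak.
    """
    moving = chirp_frames - chirp_frames.mean(axis=0, keepdims=True)
    spectrum = np.abs(np.fft.fft(moving, axis=1)).mean(axis=0)
    half = spectrum[: N_SAMPLES // 2]           # keep positive beat frequencies
    peak_bin = int(np.argmax(half))
    beat_freq = peak_bin * FS / N_SAMPLES       # beat frequency of the peak bin
    return beat_freq * C / (2 * SLOPE)          # f_beat = 2 * R * SLOPE / c

if __name__ == "__main__":
    # Synthetic demo: a reflector near 1.8 m with a small jaw-like micro-motion.
    rng = np.random.default_rng(0)
    t = np.arange(N_SAMPLES) / FS
    frames = []
    for k in range(64):
        r = 1.8 + 0.001 * np.sin(0.2 * k)       # millimeter-scale displacement
        f_beat = 2 * r * SLOPE / C
        frames.append(np.exp(2j * np.pi * f_beat * t)
                      + 0.05 * rng.standard_normal(N_SAMPLES))
    print(f"estimated range: {estimate_range(np.array(frames)):.2f} m")

In this sketch the micro-motion is what survives clutter removal, which mirrors the summary's observation that subtle facial muscle movements, rather than the static scene, carry the gesture information.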