Hand Gesture Recognition for Blind Users by Tracking 3D Gesture Trajectory

Detailed Bibliography
Published in: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI), Volume 2024
Main authors: Khanna, Prerna; Ramakrishnan, I. V.; Jain, Shubham; Bi, Xiaojun; Balasubramanian, Aruna
Format: Journal Article
Language: English
Published: United States, 01.01.2024
Description
Summary: Hand gestures provide an alternate interaction modality for blind users and can be supported using commodity smartwatches without requiring specialized sensors. The enabling technology is an accurate gesture recognition algorithm, but almost all algorithms are designed for sighted users. Our study shows that gestures performed by blind users differ considerably from those of sighted users, rendering current recognition algorithms unsuitable. Blind users' gestures have high inter-user variance, making it difficult to learn gesture patterns without large-scale training data. Instead, we design a gesture recognition algorithm that works on a 3D representation of the gesture trajectory, capturing motion in free space. Our insight is to extract a micro-movement in the gesture that is user-invariant and use this micro-movement for gesture classification. To this end, we develop an ensemble classifier that combines image classification with geometric properties of the gesture. Our evaluation demonstrates 92% classification accuracy, surpassing the next-best state-of-the-art approach, which achieves 82%.
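To make the ensemble idea from the abstract concrete, the sketch below shows one plausible way such a classifier could be assembled: a 3D gesture trajectory is rasterized into a 2D image for an image classifier, simple geometric descriptors (path length, bounding box, net displacement) feed a second classifier, and the two probability estimates are averaged. All function names, feature choices, and models here are illustrative assumptions; the paper's actual micro-movement extraction and architecture are not reproduced.

```python
# Hypothetical sketch of an ensemble combining an image classifier on a
# rendered gesture trajectory with geometric trajectory features.
# Illustrative only -- not the authors' implementation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

def trajectory_to_image(traj, size=32):
    """Rasterize a (T, 3) trajectory into a flat 2D occupancy image (x-y projection)."""
    xy = traj[:, :2]
    xy = (xy - xy.min(axis=0)) / (np.ptp(xy, axis=0) + 1e-8)  # normalize to [0, 1]
    img = np.zeros((size, size))
    idx = np.clip((xy * (size - 1)).astype(int), 0, size - 1)
    img[idx[:, 1], idx[:, 0]] = 1.0  # mark cells the trajectory passes through
    return img.ravel()

def geometric_features(traj):
    """Simple shape descriptors: path length, net displacement, bounding-box extents."""
    seg = np.diff(traj, axis=0)
    path_len = np.linalg.norm(seg, axis=1).sum()
    net = np.linalg.norm(traj[-1] - traj[0])   # start-to-end displacement
    bbox = np.ptp(traj, axis=0)                # extent along x, y, z
    return np.concatenate([[path_len, net], bbox])

def fit_ensemble(trajs, labels):
    """Train one classifier per view of the gesture."""
    X_img = np.stack([trajectory_to_image(t) for t in trajs])
    X_geo = np.stack([geometric_features(t) for t in trajs])
    img_clf = LogisticRegression(max_iter=1000).fit(X_img, labels)
    geo_clf = RandomForestClassifier(n_estimators=100).fit(X_geo, labels)
    return img_clf, geo_clf

def predict(img_clf, geo_clf, traj):
    """Average the two probability estimates and take the most likely class."""
    p_img = img_clf.predict_proba([trajectory_to_image(traj)])
    p_geo = geo_clf.predict_proba([geometric_features(traj)])
    return (p_img + p_geo).argmax()
```

Averaging class probabilities is only one simple fusion rule; weighted voting or stacking a meta-classifier on the two probability vectors would fit the same ensemble framing.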
DOI: 10.1145/3613904.3642602