Hand Gesture Classifier Using Edge Artificial Intelligence

Detailed bibliography
Published in: IEEE Latin American Conference on Computational Intelligence (Online), pp. 1-6
Main authors: Bautista Cifuentes, Juan Camilo; Marin Patino, Sergio Andres; Mahecha, Esteban Morales; Alberto Chaparro, Javier
Format: Conference paper
Language: English
Published: IEEE, 13.11.2024
ISSN: 2769-7622
Description
Summary: This document presents the development of an edge artificial intelligence application for hand gesture recognition using an LSM9DS1 Inertial Measurement Unit (IMU). The study aimed to differentiate between four specific hand gestures: upward movement for forward, downward movement for stop, leftward movement for turn left, and rightward movement for turn right. Such gesture recognition systems can be crucial in enhancing human-computer interaction, enabling more intuitive control mechanisms for applications including assistive technologies for people with disabilities and innovative user interfaces for smart devices. The IMU was integrated into an Arduino Nano 33 BLE Sense board, which captured acceleration and gyroscope values along the three Cartesian axes (X, Y, Z). A dataset was created with 150 repetitions of each gesture per test subject, totaling 450 repetitions per gesture, with all group members serving as test subjects. Data collection used the Arduino IDE and a specialized library for the IMU, and the data were exported to an Excel spreadsheet for analysis. The accelerometer and gyroscope signals were characterized by their variance over a 0.2-second time window without overlap. The following methods were evaluated: K-Nearest Neighbors (KNN), Gaussian Naive Bayes (GNB), Logistic Regression (LR), Decision Tree (DT), Random Forest (RF), Support Vector Machine (SVM), and Multilayer Perceptron (MLP). The best performance was achieved by the Bayesian classifier, which was implemented on the Nano 33 BLE Sense SoC. The results demonstrated the circuit's effectiveness in accurately recognizing and differentiating the four gestures while using less than 10% of program memory and maintaining low computational cost.
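The pipeline the abstract describes (variance of the six accelerometer/gyroscope channels over non-overlapping 0.2-second windows, followed by a Gaussian Naive Bayes classifier) can be sketched as follows. This is a minimal illustration, not the authors' code: the 100 Hz sampling rate, the 1-second repetition length, and the synthetic data are assumptions introduced here, and scikit-learn stands in for whatever tooling the authors used before porting the model to the microcontroller.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

FS = 100             # assumed IMU sampling rate in Hz (not stated in the abstract)
WIN = int(0.2 * FS)  # 0.2 s window -> 20 samples, no overlap

def variance_features(signal):
    """Variance of each IMU channel over non-overlapping 0.2 s windows.

    signal: (n_samples, 6) array of [ax, ay, az, gx, gy, gz] readings.
    Returns a flat vector of per-window, per-channel variances.
    """
    n_windows = signal.shape[0] // WIN
    trimmed = signal[: n_windows * WIN].reshape(n_windows, WIN, signal.shape[1])
    return trimmed.var(axis=1).ravel()

# Synthetic stand-in for the four-gesture dataset (the paper collected 450
# repetitions per gesture; fewer are used here). Each repetition is modeled
# as 1 s of 6-channel IMU noise whose spread differs per gesture class.
rng = np.random.default_rng(0)
X, y = [], []
for gesture in range(4):  # up / down / left / right
    for _ in range(100):
        rep = rng.normal(scale=1.0 + gesture, size=(FS, 6))
        X.append(variance_features(rep))
        y.append(gesture)
X, y = np.array(X), np.array(y)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

clf = GaussianNB().fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

A Gaussian Naive Bayes model is a natural fit for the deployment target described in the abstract: inference reduces to evaluating per-class Gaussian log-likelihoods over a small feature vector, so the fitted means and variances can be exported as plain arrays to an Arduino sketch without any ML runtime.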
DOI: 10.1109/LA-CCI62337.2024.10814802