Electroencephalograph-Based Hand Movement Pattern Recognition for Prosthetic Robot Control Using a Combination of Long Short-Term Memory and Stacked Autoencoder Methods

Bibliographic Details
Published in: 2024 IEEE International Conference on Smart Mechatronics (ICSMech), pp. 225 - 229
Main Authors: Hana Sasono, Muchamad Arif; Akbar, Afgan Satrio; Fatoni, Moch. Rijal; Nanda Imron, Arizal Mujibtamala; Anam, Khairul
Format: Conference Proceeding
Language: English
Published: IEEE, 19.11.2024
Description
Summary: Electroencephalograph (EEG) signals have expanded beyond the medical field into control systems. Improving EEG-based control technology is crucial to enhancing the quality of life for people with disabilities, especially in optimizing prosthetic functions. This research proposes a method to control a prosthetic hand robot using a combination of Long Short-Term Memory (LSTM) and Stacked Autoencoder (SAE) architectures based on EEG signals. Offline tests were conducted by adjusting various parameters of the LSTM and SAE, achieving an average accuracy of 99.89% in single-subject training and indicating strong potential for recognizing functional hand-motion patterns. However, in cross-subject testing, where the model was evaluated on subjects other than those used in training, performance declined significantly, with an average accuracy of 33.97%.
DOI: 10.1109/ICSMech62936.2024.10812333
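
The record describes combining a stacked autoencoder with an LSTM for EEG-based hand-movement classification. The following is a minimal, illustrative sketch of one way such a pipeline can be wired up in Keras, not the authors' implementation: the layer sizes, number of classes, input window shape, and training settings are assumptions for illustration, since the record does not report the paper's actual parameters.

```python
# Illustrative LSTM + stacked-autoencoder pipeline for EEG window classification.
# Assumes preprocessed EEG windows of shape (n_windows, n_timesteps, n_channels);
# all sizes and hyperparameters below are assumed, not taken from the paper.
import numpy as np
from tensorflow.keras import layers, models

n_timesteps, n_channels, n_classes = 128, 8, 4

# Stacked autoencoder: compress each time step's channel vector.
encoder = models.Sequential([
    layers.Input(shape=(n_channels,)),
    layers.Dense(32, activation="relu"),
    layers.Dense(16, activation="relu"),      # compressed representation
])
decoder = models.Sequential([
    layers.Input(shape=(16,)),
    layers.Dense(32, activation="relu"),
    layers.Dense(n_channels, activation="linear"),
])
autoencoder = models.Sequential([encoder, decoder])
autoencoder.compile(optimizer="adam", loss="mse")

# LSTM classifier that reuses the pretrained encoder at every time step.
classifier = models.Sequential([
    layers.Input(shape=(n_timesteps, n_channels)),
    layers.TimeDistributed(encoder),
    layers.LSTM(64),
    layers.Dense(n_classes, activation="softmax"),
])
classifier.compile(optimizer="adam",
                   loss="sparse_categorical_crossentropy",
                   metrics=["accuracy"])

if __name__ == "__main__":
    # Random stand-in data; replace with real EEG windows and movement labels.
    X = np.random.randn(200, n_timesteps, n_channels).astype("float32")
    y = np.random.randint(0, n_classes, size=200)

    # Unsupervised pretraining of the autoencoder on flattened time steps.
    flat = X.reshape(-1, n_channels)
    autoencoder.fit(flat, flat, epochs=2, batch_size=256, verbose=0)

    # Supervised training of the LSTM classifier on labeled windows.
    classifier.fit(X, y, epochs=2, batch_size=32, verbose=0)
    print(classifier.evaluate(X, y, verbose=0))
```

In this sketch the encoder is pretrained as part of the autoencoder and then shared with the classifier, so the LSTM operates on the compressed per-timestep features; whether the paper pretrains, freezes, or jointly fine-tunes the encoder is not stated in this record.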