Electroencephalograph-Based Hand Movement Pattern Recognition for Prosthetic Robot Control Using a Combination of Long Short-Term Memory and Stacked Autoencoder Methods

Bibliographic Details
Published in: 2024 IEEE International Conference on Smart Mechatronics (ICSMech), pp. 225-229
Main Authors: Hana Sasono, Muchamad Arif; Akbar, Afgan Satrio; Fatoni, Moch. Rijal; Nanda Imron, Arizal Mujibtamala; Anam, Khairul
Format: Conference Paper
Language: English
Published: IEEE, 19 November 2024
Description
Summary: Electroencephalograph (EEG) signals have expanded beyond the medical field into control systems. Improving EEG-based control technology is crucial to enhancing the quality of life for people with disabilities, especially in optimizing prosthetic functions. This research proposes a method to control a prosthetic hand robot using a combination of Long Short-Term Memory (LSTM) and Stacked Autoencoder (SAE) architectures based on EEG signals. Offline tests were conducted by adjusting various parameters of the LSTM and SAE, achieving an average accuracy of 99.89% in single-subject training, indicating strong potential for functional hand motion pattern recognition. However, in cross-subject testing, where the model was evaluated on subjects other than those used in training, performance declined significantly, with an average accuracy of 33.97%.
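To illustrate the kind of architecture the abstract describes, the sketch below shows one plausible way to combine a stacked autoencoder with an LSTM for classifying EEG windows into hand-movement classes. This is not the paper's exact model: the channel count, window length, layer sizes, number of classes, and training details are all illustrative assumptions.

```python
# Minimal sketch (assumed architecture, not the paper's exact design):
# a stacked autoencoder compresses each EEG time step, and an LSTM
# classifies the resulting code sequence into hand-movement classes.
import torch
import torch.nn as nn

N_CHANNELS = 14     # assumed number of EEG electrodes
WINDOW_LEN = 128    # assumed samples per EEG window
N_CLASSES = 4       # assumed number of hand-movement classes


class StackedAutoencoder(nn.Module):
    """Two-layer encoder/decoder; the encoder output feeds the LSTM."""
    def __init__(self, in_dim=N_CHANNELS, hidden=32, code=16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, code), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Linear(code, hidden), nn.ReLU(),
            nn.Linear(hidden, in_dim),
        )

    def forward(self, x):                  # x: (batch, time, channels)
        code = self.encoder(x)
        recon = self.decoder(code)         # reconstruction for the SAE loss
        return code, recon


class LSTMSAEClassifier(nn.Module):
    """SAE feature extractor followed by an LSTM and a linear classifier."""
    def __init__(self, code=16, lstm_hidden=64, n_classes=N_CLASSES):
        super().__init__()
        self.sae = StackedAutoencoder(code=code)
        self.lstm = nn.LSTM(code, lstm_hidden, batch_first=True)
        self.head = nn.Linear(lstm_hidden, n_classes)

    def forward(self, x):
        code, recon = self.sae(x)
        _, (h_n, _) = self.lstm(code)      # last hidden state summarizes the window
        logits = self.head(h_n[-1])
        return logits, recon


if __name__ == "__main__":
    model = LSTMSAEClassifier()
    eeg = torch.randn(8, WINDOW_LEN, N_CHANNELS)   # dummy batch of EEG windows
    logits, recon = model(eeg)
    print(logits.shape)                            # torch.Size([8, 4])
```

In a setup like this, the classifier would typically be trained with a cross-entropy loss on the logits, optionally adding a reconstruction loss on `recon` so the autoencoder keeps learning a faithful low-dimensional representation of the EEG channels.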
DOI: 10.1109/ICSMech62936.2024.10812333