A Wearable Multisensor Fusion System for Neuroprosthetic Hand

Published in: IEEE Sensors Journal, Volume 25, Issue 8, pp. 12547–12558
Main Authors: Yin, Zongtian; Meng, Jianjun; Shi, Shang; Guo, Weichao; Yang, Xingchen; Ding, Han; Liu, Honghai
Format: Journal Article
Language: English
Publication Details: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), April 15, 2025
ISSN: 1530-437X, 1558-1748
Description
Summary: A neural interface translating human motor intentions into control commands for prosthetic hands helps amputees restore upper limb function. However, commercial neural interfaces with a few surface electromyography (sEMG) sensors are constrained by limitations such as low spatiotemporal resolution, a limited number of recognizable hand gestures, and sensitivity to arm position. Multimodal sensor fusion presents a viable approach to overcoming these challenges, offering improved accuracy, versatility, and robustness in gesture recognition. In this study, we developed a wearable multisensor fusion system compact enough to be integrated into a prosthetic socket. The fusion probe measured 38.5 × 20.5 × 13.5 mm, and the signal acquisition/processing device measured 50 × 40 × 15 mm. The fusion system incorporated three types of sensors, capturing muscle movements from morphology (A-mode ultrasound), electrophysiology (sEMG), and kinematics (inertial measurement unit, IMU). Gesture recognition experiments were conducted with 20 subjects, including both healthy individuals and amputees, achieving classification accuracies of 94.8% ± 1.1% and 96.9% ± 1.3%, respectively, for six common gestures. Furthermore, we proposed a new control strategy based on the characteristics of sensor fusion to enhance the stability of online gesture classification. Practical online testing with amputees wearing prostheses indicated that the designed fusion system achieved high classification accuracy and stability during gesture recognition. These results demonstrate that the wearable multisensor fusion system is well suited for integration into prostheses, offering a robust solution for amputees' practical use.
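
To make the fusion idea concrete, below is a minimal sketch of feature-level fusion across the three modalities named in the abstract. The record does not specify the paper's features or classifier, so the per-channel summary features (mean ultrasound echo intensity, sEMG mean absolute value, IMU channel means), the channel counts, and the linear discriminant classifier are illustrative assumptions, not the authors' method.

    # Feature-level fusion sketch: concatenate per-modality features from one
    # analysis window and classify into one of six gestures. All feature
    # choices and channel counts are assumptions for illustration.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def extract_features(us_win, emg_win, imu_win):
        """Build one fused feature vector from a single analysis window.

        us_win:  (samples, us_channels)  A-mode ultrasound echo envelopes
        emg_win: (samples, emg_channels) surface EMG
        imu_win: (samples, 6)            accelerometer + gyroscope axes
        """
        us_feat = us_win.mean(axis=0)            # mean echo intensity per channel
        emg_feat = np.abs(emg_win).mean(axis=0)  # sEMG mean absolute value
        imu_feat = imu_win.mean(axis=0)          # mean motion/orientation per axis
        return np.concatenate([us_feat, emg_feat, imu_feat])

    # Synthetic stand-in data: 300 windows, six gesture classes.
    rng = np.random.default_rng(0)
    n_windows, n_classes = 300, 6
    X = np.stack([
        extract_features(rng.normal(size=(100, 4)),
                         rng.normal(size=(100, 8)),
                         rng.normal(size=(100, 6)))
        for _ in range(n_windows)
    ])
    y = rng.integers(0, n_classes, size=n_windows)

    clf = LinearDiscriminantAnalysis().fit(X, y)
    print(clf.predict(X[:5]))  # per-window gesture predictions
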
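The abstract also mentions a control strategy that stabilizes online gesture classification. The record does not detail that strategy, so the sketch below uses a generic stand-in: a majority vote over the most recent window-level predictions, which only switches the commanded gesture once a new label wins a strict majority. This suppresses single-window misclassifications at the cost of a short switching delay.

    # Majority-vote smoothing sketch for online gesture output. This is an
    # illustrative stand-in, not the paper's proposed control strategy.
    from collections import Counter, deque

    class MajorityVoteSmoother:
        def __init__(self, window: int = 5):
            self.history = deque(maxlen=window)  # recent raw predictions
            self.current = None                  # gesture currently commanded

        def update(self, label: int) -> int:
            self.history.append(label)
            winner, count = Counter(self.history).most_common(1)[0]
            # Switch only when a new label holds a strict majority of the window.
            if winner != self.current and count > len(self.history) // 2:
                self.current = winner
            return self.current

    smoother = MajorityVoteSmoother(window=5)
    for raw in [0, 0, 3, 0, 3, 3, 3, 3, 1, 3]:  # noisy per-window predictions
        print(raw, "->", smoother.update(raw))   # isolated flips are suppressed
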
DOI: 10.1109/JSEN.2025.3546214