Human Gait Recognition System Based on Support Vector Machine Algorithm and Using Wearable Sensors

Bibliographic Details
Published in: Sensors and Materials, Vol. 31, No. 4, p. 1335
Authors: Wang, Fangzheng; Yan, Lei; Xiao, Jiang
Format: Journal Article
Language: English
Published: Tokyo: MYU Scientific Publishing Division, 01.01.2019
ISSN: 0914-4935
Online access: Full text
Abstract: Human gait recognition is very important for controlling exoskeletons and achieving smooth gait transitions, so gait information must be obtained accurately. Therefore, to accurately control exoskeleton movement, a multisensor fusion gait recognition system was developed in this study. The system acquires plantar pressure and acceleration signals of the human legs. In the experiment, we collected the pressure signals of both feet and the movement data of the waist, left thigh, left calf, right thigh, and right calf of five test subjects. We investigated the gaits of standing, level walking, going up stairs, going down stairs, going up a slope, and going down a slope. The gait recognition accuracies of a support vector machine (SVM), a back-propagation (BP) neural network, and a radial basis function (RBF) neural network were compared, and different sliding window sizes for the SVM algorithm were analyzed. The results showed that the SVM algorithm achieved the highest recognition rate, with an average recognition accuracy of 96.5%. The accurate recognition of human gait provides a good theoretical basis for the design of an exoskeleton robot control strategy.
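The abstract describes a pipeline of sliding-window feature extraction over multisensor signals followed by SVM classification of six gait classes. A minimal sketch of that pipeline is shown below using scikit-learn on synthetic data; the window length, channel count, window features (per-channel mean and standard deviation), and SVM parameters are all assumptions for illustration, not the paper's actual settings.

```python
# Sliding-window SVM gait classification: a minimal sketch on
# synthetic data (the paper's exact features and parameters are
# not specified here and are assumed for illustration).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
GAITS = ["standing", "level_walk", "stairs_up",
         "stairs_down", "slope_up", "slope_down"]
WINDOW = 50     # samples per sliding window (hypothetical)
CHANNELS = 12   # plantar pressure + leg acceleration channels (hypothetical)

def window_features(window):
    """Per-channel mean and standard deviation: a simple, common choice."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

# Synthetic stand-in for sensor windows: each gait class gets a
# different channel mean so the classes are separable.
X, y = [], []
for label, gait in enumerate(GAITS):
    windows = rng.normal(label, 1.0, size=(200, WINDOW, CHANNELS))
    for w in windows:
        X.append(window_features(w))
        y.append(label)
X, y = np.array(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# RBF-kernel SVM, the default multiclass setup in scikit-learn.
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"test accuracy: {acc:.3f}")
```

In practice, the window size trades off latency against feature stability, which is presumably why the paper analyzes several sliding window sizes for the SVM.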
DOI: 10.18494/SAM.2019.2288