Human Gait Recognition System Based on Support Vector Machine Algorithm and Using Wearable Sensors

Bibliographic Details
Published in: Sensors and Materials, Vol. 31, No. 4, p. 1335
Main Authors: Wang, Fangzheng; Yan, Lei; Xiao, Jiang
Format: Journal Article
Language: English
Published: Tokyo: MYU Scientific Publishing Division, 01.01.2019
ISSN: 0914-4935
Description
Summary: Human gait recognition is very important for controlling exoskeletons and achieving smooth gait transitions, and the gait information must be obtained accurately. Therefore, to accurately control exoskeleton movement, a multisensor-fusion gait recognition system was developed in this study. The system acquires plantar-pressure and acceleration signals from the human legs. In the experiment, we collected the pressure signals of both feet and the movement data of the waist, left thigh, left calf, right thigh, and right calf of five test subjects. We investigated six gaits: standing, level walking, going up stairs, going down stairs, going up a slope, and going down a slope. The gait recognition accuracies of the support vector machine (SVM), the back-propagation (BP) neural network, and the radial basis function (RBF) neural network were compared, and different sliding-window sizes for the SVM algorithm were analyzed. The results showed that the SVM algorithm achieved the highest recognition rate, with an average recognition accuracy of 96.5%. The accurate recognition of human gait provides a sound theoretical basis for the design of an exoskeleton robot control strategy.
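The abstract outlines a pipeline of sliding-window segmentation of fused plantar-pressure and acceleration signals followed by SVM classification of six gait classes. The sketch below is a minimal, hypothetical illustration of that idea using scikit-learn; the window size, the per-window statistical features, the RBF-kernel SVM settings, and the synthetic stand-in data are all assumptions, as the record does not specify the authors' implementation.

```python
# Hypothetical sketch: sliding-window segmentation + SVM gait classification.
# Window size, features, classifier settings, and data are assumptions,
# not the authors' implementation.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

GAITS = ["stand", "level_walk", "stairs_up", "stairs_down",
         "slope_up", "slope_down"]

def sliding_windows(signals, labels, window=128, step=64):
    """Cut multichannel signals (samples x channels) into fixed-length
    windows and summarize each window with simple per-channel statistics."""
    feats, ys = [], []
    for start in range(0, len(signals) - window + 1, step):
        seg = signals[start:start + window]
        # Mean/std/min/max per channel as window features (an assumption).
        feats.append(np.concatenate([seg.mean(axis=0), seg.std(axis=0),
                                     seg.min(axis=0), seg.max(axis=0)]))
        # Label each window by the majority of its per-sample labels.
        ys.append(np.bincount(labels[start:start + window]).argmax())
    return np.array(feats), np.array(ys)

# Synthetic stand-in for the fused sensor streams (e.g., 2 pressure
# insoles + 5 body-mounted accelerometers -> 17 channels here).
rng = np.random.default_rng(0)
raw = rng.normal(size=(6000, 17))              # samples x channels
lab = rng.integers(0, len(GAITS), size=6000)   # per-sample gait labels

X, y = sliding_windows(raw, lab)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0)

clf = SVC(kernel="rbf", C=10.0, gamma="scale")  # a common SVM configuration
clf.fit(X_tr, y_tr)
print("per-window accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```

On real labeled gait data, rerunning this loop with different `window` values would allow a direct comparison of sliding-window sizes, which is the trade-off the paper analyzes for the SVM.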
DOI: 10.18494/SAM.2019.2288