Normal and pathological gait classification LSTM model

Bibliographic Details
Published in: Artificial Intelligence in Medicine, Volume 94, pp. 54-66
Main authors: Khokhlova, Margarita; Migniot, Cyrille; Morozov, Alexey; Sushkova, Olga; Dipanda, Albert
Format: Journal Article
Language: English
Published: Netherlands: Elsevier B.V., 01.03.2019
ISSN: 0933-3657, 1873-2860
Description
Summary: Computer vision-based clinical gait analysis is the subject of ongoing research. However, very few datasets are publicly available, so comparing existing methods with one another is not straightforward. Even when test data are openly accessible, existing databases contain very few test subjects and single-modality measurements, which limits their usage. The contributions of this paper are three-fold. First, we propose a new open-access multi-modal database acquired with the Kinect v.2 camera for the task of gait analysis. Second, we adapt the skeleton joint orientation data to calculate kinematic gait parameters that match gold-standard MOCAP systems, and we propose a new set of features based on 3D lower-limb flexion dynamics to analyze gait symmetry. Third, we design a Long Short-Term Memory (LSTM) ensemble model to create an unsupervised gait classification tool. The results show that the joint orientation data provided by Kinect can be successfully used in an inexpensive clinical gait monitoring system, with results moderately better than the reported state of the art for three normal/pathological gait classes.
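
To illustrate the kind of model the summary describes, the minimal Python (PyTorch) sketch below classifies a sequence of per-frame lower-limb flexion angles with a single LSTM. The feature layout, layer sizes, and the three gait classes are illustrative assumptions; the paper's actual design is an ensemble of LSTMs trained on Kinect v.2-derived features, and this sketch is not its implementation.

# Minimal sketch of an LSTM gait-sequence classifier (PyTorch).
# Feature count, hidden size, and class labels are assumptions for
# illustration; they are not taken from the paper.
import torch
import torch.nn as nn

class GaitLSTMClassifier(nn.Module):
    def __init__(self, n_features=6, hidden_size=64, n_classes=3):
        super().__init__()
        # One LSTM layer over per-frame kinematic features
        # (e.g. left/right hip, knee, and ankle flexion angles).
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, n_classes)

    def forward(self, x):
        # x: (batch, n_frames, n_features) sequence of flexion angles
        _, (h_n, _) = self.lstm(x)   # final hidden state summarizes the sequence
        return self.fc(h_n[-1])      # logits over the three gait classes

# Example forward pass on a batch of 4 sequences, 120 frames each,
# using random values in place of real Kinect-derived angles.
model = GaitLSTMClassifier()
angles = torch.randn(4, 120, 6)
logits = model(angles)
print(logits.shape)  # torch.Size([4, 3])
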
DOI: 10.1016/j.artmed.2018.12.007