DeepFall: Non-Invasive Fall Detection with Deep Spatio-Temporal Convolutional Autoencoders


Bibliographic Details
Published in: Journal of Healthcare Informatics Research, Vol. 4, Issue 1, pp. 50-70
Main authors: Nogas, Jacob; Khan, Shehroz S.; Mihailidis, Alex
Format: Journal Article
Language: English
Published: Cham: Springer International Publishing, 01.03.2020
ISSN: 2509-4971, 2509-498X
Online access: Full text
Description
Abstract: Human falls rarely occur; however, detecting falls is very important from a health and safety perspective. Due to the rarity of falls, it is difficult to employ supervised classification techniques to detect them. Moreover, in these highly skewed situations, it is also difficult to extract domain-specific features to identify falls. In this paper, we present a novel framework, DeepFall, which formulates the fall detection problem as an anomaly detection problem. The DeepFall framework presents the novel use of deep spatio-temporal convolutional autoencoders to learn spatial and temporal features from normal activities using non-invasive sensing modalities. We also present a new anomaly scoring method that combines the reconstruction scores of frames across a temporal window to detect unseen falls. We tested the DeepFall framework on three publicly available datasets collected through non-invasive sensing modalities (thermal and depth cameras) and show superior results in comparison with traditional autoencoder methods in identifying unseen falls.
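The abstract's core idea, an autoencoder trained only on normal activity whose per-frame reconstruction error over a temporal window flags unseen falls, can be sketched briefly. The following PyTorch sketch is illustrative only: the paper's actual layer sizes, window length, and score-aggregation rule are not given in this record, so the architecture, the 8-frame window, and the max-over-frames aggregation below are all assumptions.

```python
# Minimal sketch of a spatio-temporal convolutional autoencoder with a
# windowed reconstruction-error anomaly score. Hyperparameters are
# illustrative, not the DeepFall paper's actual configuration.
import torch
import torch.nn as nn

class SpatioTemporalConvAE(nn.Module):
    """3D convolutional autoencoder over a short window of video frames."""
    def __init__(self):
        super().__init__()
        # Encoder: compress a (1, T, H, W) window spatially and temporally.
        self.encoder = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv3d(16, 8, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
        )
        # Decoder: reconstruct the input window from the latent volume.
        self.decoder = nn.Sequential(
            nn.ConvTranspose3d(8, 16, kernel_size=3, stride=2,
                               padding=1, output_padding=1),
            nn.ReLU(),
            nn.ConvTranspose3d(16, 1, kernel_size=3, stride=2,
                               padding=1, output_padding=1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def window_anomaly_score(model, window):
    """Per-frame reconstruction error, aggregated over the temporal window.

    `window` has shape (1, 1, T, H, W); frames from unseen falls should
    reconstruct poorly because training used only normal activities.
    """
    model.eval()
    with torch.no_grad():
        recon = model(window)
    # Mean squared error per frame: average over channel, height, width.
    per_frame = ((window - recon) ** 2).mean(dim=(1, 3, 4)).squeeze(0)
    # One simple way to combine frame scores across the window: take the max,
    # so a single badly reconstructed frame is enough to raise an alarm.
    return per_frame.max().item()

# Usage on a dummy 8-frame, 64x64 thermal window (an untrained model, so
# the score is meaningless here; in practice train on normal activity first).
model = SpatioTemporalConvAE()
score = window_anomaly_score(model, torch.rand(1, 1, 8, 64, 64))
print(f"anomaly score: {score:.4f}")
```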
DOI: 10.1007/s41666-019-00061-4