Deep learning approach for human action recognition in infrared images

Bibliographic Details
Published in: Cognitive Systems Research, Vol. 50, pp. 146-154
Main Authors: Akula, Aparna; Shah, Anuj K.; Ghosh, Ripul
Format: Journal Article
Language: English
Published: Elsevier B.V., 1 August 2018
ISSN: 1389-0417
Online Access: Full text
Description
Abstract: Human action recognition-based ambient assisted living (AAL) systems, targeted towards providing assistance to the elderly and persons with disabilities, have been of interest to researchers from various disciplines. The research primarily focuses on the development of automatic, minimally intrusive, and privacy-preserving systems. Although popular in the strategic sector, thermal infrared (IR) cameras have not been explored much in AAL. This work demonstrates the use of IR cameras in the field of AAL and discusses their performance in human action recognition (HAR). Particular attention is drawn to one of the most critical actions: falling. To this end, a dataset of IR images was generated comprising six action classes: walking, standing, sitting on a chair, sitting on a chair with a desk in front, fallen on the desk in front, and fallen/lying on the ground. The dataset comprises 5278 image samples randomly sampled from thermal videos, each about 30 s long, representing the six action classes. To achieve robust action recognition, we designed a supervised Convolutional Neural Network (CNN) architecture with two convolution layers to classify the six action classes. A classification accuracy of 87.44% was achieved on a manually selected, complex test set.
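As a rough illustration of the kind of model the abstract describes, the following is a minimal sketch in Python (Keras) of a two-convolution-layer CNN classifying six action classes from single-channel thermal frames. The input resolution, filter counts, kernel sizes, pooling, dense head, and training settings are all illustrative assumptions; the authors' actual configuration is not given in this record.

    import tensorflow as tf
    from tensorflow.keras import layers, models

    # Assumed input size: the paper's frame resolution is not stated here,
    # so 64x64 single-channel (thermal intensity) images are used for illustration.
    INPUT_SHAPE = (64, 64, 1)
    NUM_CLASSES = 6  # walking, standing, sitting, sitting at desk,
                     # fallen on desk, fallen/lying on ground

    def build_model():
        """Two-convolution-layer CNN classifier (hyperparameters assumed)."""
        model = models.Sequential([
            layers.Conv2D(32, (5, 5), activation="relu", input_shape=INPUT_SHAPE),
            layers.MaxPooling2D((2, 2)),
            layers.Conv2D(64, (5, 5), activation="relu"),
            layers.MaxPooling2D((2, 2)),
            layers.Flatten(),
            layers.Dense(128, activation="relu"),
            layers.Dense(NUM_CLASSES, activation="softmax"),
        ])
        # Integer class labels are assumed, hence the sparse loss.
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model

    if __name__ == "__main__":
        build_model().summary()

The sketch follows the abstract's description only in its broad shape (two convolution layers, six-way softmax output); everything else is a placeholder that would need the paper's reported settings to reproduce the stated 87.44% accuracy.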
DOI: 10.1016/j.cogsys.2018.04.002