Deep learning approach for human action recognition in infrared images



Bibliographic Details
Published in: Cognitive Systems Research, Vol. 50, pp. 146-154
Main Authors: Akula, Aparna; Shah, Anuj K.; Ghosh, Ripul
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.08.2018
ISSN: 1389-0417
Description
Summary: Human action recognition-based ambient assisted living (AAL) systems, targeted towards providing assistance for the elderly and persons with disabilities, have been of interest to researchers from various disciplines. The research primarily focuses on the development of automatic, minimally intrusive and privacy-preserving systems. Although popular in the strategic sector, thermal infrared (IR) cameras have not been explored much in AAL. This work demonstrates the use of IR cameras in the field of AAL and discusses their performance in human action recognition (HAR). Particular attention is drawn towards one of the most critical actions: falling. To this end, a dataset of IR images was generated comprising six action classes: walking, standing, sitting on a chair, sitting on a chair with a desk in front, fallen on the desk in front, and fallen/lying on the ground. The dataset comprises 5278 image samples randomly sampled from thermal videos, each about 30 s long, representing the six action classes. To achieve robust action recognition, we designed a supervised Convolutional Neural Network (CNN) architecture with two convolution layers to classify the six action classes. A classification accuracy of 87.44% was achieved on the manually selected complex test data.
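
The summary specifies only the skeleton of the network: two convolution layers feeding a six-way classifier. Below is a minimal PyTorch sketch of such an architecture; the input resolution, channel counts, kernel sizes, and pooling steps are illustrative assumptions, not the configuration reported in the article.

import torch
import torch.nn as nn

class ActionCNN(nn.Module):
    """Two-convolution-layer CNN for six-class IR action recognition.

    Only the two-conv-layer / six-class structure comes from the summary;
    every size below (input resolution, channel counts, kernel sizes) is
    an illustrative assumption, not the authors' configuration.
    """

    def __init__(self, num_classes: int = 6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5),   # single-channel thermal input (assumed)
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=5),  # second (final) convolution layer
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        # 64x64 input -> 60x60 -> 30x30 -> 26x26 -> 13x13 feature maps
        self.classifier = nn.Linear(32 * 13 * 13, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Example: one batch of assumed 64x64 single-channel thermal frames.
logits = ActionCNN()(torch.randn(8, 1, 64, 64))
print(logits.shape)  # torch.Size([8, 6])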
DOI: 10.1016/j.cogsys.2018.04.002