Saliency based human fall detection in smart home environments using posture recognition


Published in: Health informatics journal, Volume 27, Issue 3, p. 14604582211030954
Main authors: Mousse, Mikael Ange; Atohoun, Béthel
Format: Journal Article
Language: English
Published: London, England: SAGE Publications, 01.07.2021
ISSN: 1460-4582, 1741-2811
Description
Summary: The implementation of people monitoring systems is an evolving research theme. This paper introduces an elderly monitoring system that recognizes human posture from overlapping cameras for fall detection in a smart home environment. In such environments, the zone of movement is limited. Our approach exploits this characteristic to recognize human posture quickly through a region-wise modelling approach. It classifies a person's pose into four groups: standing, crouching, sitting and lying on the floor. These postures are obtained by estimating the human bounding volume, which is computed from the person's height and the surface area in contact with the ground, according to the foreground information of each camera. Using these measures, we distinguish each posture and differentiate the lying-on-floor posture, which can be considered the falling posture, from the other postures. The global multiview information of the scene is obtained by homographic projection. We test the proposed algorithm on a public multiple-camera fall detection dataset, and the results demonstrate the efficiency of our method.
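The abstract describes classifying posture from two estimated quantities: the person's height and the ground-contact surface area. A minimal sketch of such a rule-based classifier is shown below; the threshold values and function name are illustrative assumptions, not taken from the paper.

```python
def classify_posture(height_m, ground_area_m2,
                     stand_h=1.4, crouch_h=0.9, lying_h=0.6,
                     lying_area=0.5):
    """Toy posture classifier from estimated height and ground-contact area.

    All thresholds (metres, square metres) are hypothetical examples;
    the paper derives its estimates from multi-camera foreground masks.
    """
    # A large ground-contact area combined with low height suggests
    # the lying-on-floor (potential fall) posture.
    if ground_area_m2 >= lying_area and height_m < lying_h:
        return "lying"
    if height_m >= stand_h:
        return "standing"
    if height_m >= crouch_h:
        return "crouching"
    return "sitting"
```

In this sketch, only the "lying" decision uses the ground-contact area; the remaining postures are separated by height alone, which mirrors the intuition (though not necessarily the exact rules) in the abstract.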
DOI:10.1177/14604582211030954