Saliency-based human fall detection in smart home environments using posture recognition

Bibliographic Details
Published in:Health informatics journal Vol. 27; no. 3; p. 14604582211030954
Main Authors: Mousse, Mikael Ange, Atohoun, Béthel
Format: Journal Article
Language:English
Published: London, England: SAGE Publications, 01.07.2021
Subjects:
ISSN:1460-4582, 1741-2811
Description
Summary:The implementation of people monitoring systems is an evolving research theme. This paper introduces an elderly monitoring system that recognizes human posture from overlapping cameras for fall detection in a smart home environment. In these environments, the zone of movement is limited. Our approach exploits this characteristic to recognize human posture quickly by proposing a region-wise modelling approach. It classifies a person's pose into four groups: standing, crouching, sitting, and lying on the floor. These postures are obtained by estimating the human bounding volume, which is computed from the person's height and the surface in contact with the ground, derived from the foreground information of each camera. Using these cues, we distinguish each posture and separate the lying-on-the-floor posture, which can be considered the falling posture, from the others. The global multiview information of the scene is obtained through homographic projection. We test the proposed algorithm on a public multiple-camera fall detection dataset, and the results demonstrate the efficiency of our method.
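The classification step described in the summary — mapping an estimated height and ground-contact area to one of four postures — can be sketched as a simple threshold rule. This is a minimal illustrative sketch only: the threshold values, function name, and the exact decision rules are assumptions, not the parameters or logic used by the authors.

```python
# Hypothetical sketch of the posture-classification step from the abstract.
# A person's bounding volume is summarized by an estimated height and the
# area of their surface in contact with the ground; these two cues are
# thresholded into four postures. All threshold values are illustrative
# assumptions, not the paper's actual parameters.

def classify_posture(height_m: float, ground_area_m2: float) -> str:
    """Map estimated height and ground-contact area to one of four postures."""
    if height_m > 1.4:
        return "standing"
    if height_m > 0.9:
        # At intermediate heights, a small footprint suggests a crouch,
        # a larger one suggests sitting.
        return "crouching" if ground_area_m2 < 0.3 else "sitting"
    # A low height combined with a large ground-contact area suggests
    # lying on the floor, which the paper treats as the fall indicator.
    if ground_area_m2 > 0.5:
        return "lying on the floor"
    return "sitting"

if __name__ == "__main__":
    print(classify_posture(1.75, 0.10))  # standing
    print(classify_posture(0.40, 0.80))  # lying on the floor
```

In the paper, the height and ground-contact estimates come from per-camera foreground segmentation fused across overlapping views via homographic projection; the sketch above only covers the final thresholding stage.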
Bibliography:ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
ISSN:1460-4582
1741-2811
DOI:10.1177/14604582211030954