Background-Subtraction Algorithm Optimization for Home Camera-Based Night-Vision Fall Detectors

Bibliographic Details
Published in: IEEE Access, Vol. 7, pp. 152399-152411
Main Authors: Alonso, Mercedes, Brunete, Alberto, Hernando, Miguel, Gambao, Ernesto
Format: Journal Article
Language:English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2019
ISSN: 2169-3536
Description
Summary: Background subtraction is one of the key pre-processing steps necessary for obtaining relevant information from a video sequence. The choice of background subtraction algorithm and its parameters is also critical for achieving optimal detection performance, especially in night environments. The research contribution presented in this paper is the identification of the optimal background subtraction algorithm for indoor night-time environments, with a focus on the detection of human falls. Thirty background subtraction algorithms are analyzed to determine which performs best in indoor night-time environments. Genetic algorithms are applied to identify the best background subtraction algorithm, to optimize the subtractor's parameters, and to determine the optimal number of pre- and post-processing operations. The results show that the best algorithm for fall detection in indoor night-time environments is LBAdaptiveSOM; the optimal parameters and processing operations for this algorithm are also reported.
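The pipeline the summary describes (background model, foreground differencing, then post-processing to clean the mask) can be sketched in a minimal form. This is an illustrative NumPy sketch only, not the paper's implementation: a running-average model stands in for the SOM-based LBAdaptiveSOM subtractor (which, to the best of my knowledge, is distributed with the BGSLibrary collection of background subtractors), and the function names, the 3x3 structuring element, and all parameter defaults here are assumptions chosen for the example.

```python
import numpy as np

def update_background(bg, frame, alpha=0.05):
    # Running-average background model: a simple stand-in for the
    # adaptive SOM-based subtractor evaluated in the paper.
    # `alpha` (assumed value) controls how fast the model adapts.
    return (1 - alpha) * bg + alpha * frame.astype(float)

def foreground_mask(bg, frame, thresh=25):
    # Pixels whose absolute difference from the background model
    # exceeds `thresh` (assumed value) are marked as foreground.
    return np.abs(frame.astype(float) - bg) > thresh

def morphological_open(mask, k=3):
    # Erosion followed by dilation with a k x k window: a typical
    # post-processing step that suppresses the speckle noise common
    # in infrared night footage while preserving large blobs.
    pad = k // 2

    def erode(m):
        p = np.pad(m, pad, constant_values=True)
        out = np.ones_like(m)
        for dy in range(k):
            for dx in range(k):
                out &= p[dy:dy + m.shape[0], dx:dx + m.shape[1]]
        return out

    def dilate(m):
        p = np.pad(m, pad, constant_values=False)
        out = np.zeros_like(m)
        for dy in range(k):
            for dx in range(k):
                out |= p[dy:dy + m.shape[0], dx:dx + m.shape[1]]
        return out

    return dilate(erode(mask))
```

In the paper's setup, parameters such as the adaptation rate, the threshold, and the number of pre- and post-processing operations like the opening above are exactly the kind of quantities tuned by the genetic algorithm rather than fixed by hand.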
DOI:10.1109/ACCESS.2019.2948321