Background-Subtraction Algorithm Optimization for Home Camera-Based Night-Vision Fall Detectors
Saved in:
| Published in: | IEEE Access, Volume 7, pp. 152399 - 152411 |
|---|---|
| Main Authors: | , , , |
| Format: | Journal Article |
| Language: | English |
| Published: | Piscataway: IEEE, 2019. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| Subjects: | |
| ISSN: | 2169-3536 |
| Online Access: | Get full text |
| Summary: | Background subtraction is one of the key pre-processing steps necessary for obtaining relevant information from a video sequence. The selection of a background subtraction algorithm and its parameters is also important for achieving optimal detection performance, especially in night environments. The research contribution presented in this paper is the identification of the optimal background subtraction algorithm in indoor night-time environments, with a focus on the detection of human falls. Thirty background subtraction algorithms are analyzed to determine which has the best performance in indoor night-time environments. Genetic algorithms have been applied to identify the best background subtraction algorithm, to optimize the background subtractor parameters, and to calculate the optimal number of pre- and post-processing operations. The results show that the best algorithm for fall detection in indoor night-time environments is LBAdaptativeSOM; the optimal parameters and processing operations for this algorithm are reported. |
|---|---|
| Bibliography: | ObjectType-Article-1 SourceType-Scholarly Journals-1 ObjectType-Feature-2 content type line 14 |
| ISSN: | 2169-3536 |
| DOI: | 10.1109/ACCESS.2019.2948321 |
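To make the summary's pipeline concrete, the sketch below shows the general shape of background subtraction on a video sequence: maintain a per-pixel background model, then classify pixels that deviate from it as foreground. This is a minimal illustration only; it uses a simple exponential running-average model as a stand-in for the SOM-based LBAdaptativeSOM algorithm the paper identifies as best, and all function names, thresholds, and the synthetic frames are illustrative assumptions, not the paper's method or parameters.

```python
import numpy as np

def update_background(bg, frame, alpha=0.05):
    # Exponential running-average background model (illustrative
    # stand-in for the SOM-based model evaluated in the paper).
    # `alpha` controls how quickly the model absorbs scene changes.
    return (1.0 - alpha) * bg + alpha * frame

def foreground_mask(bg, frame, threshold=25.0):
    # Pixels differing from the background model by more than
    # `threshold` grey levels are classified as foreground.
    return np.abs(frame - bg) > threshold

# Synthetic 8x8 "night-time" greyscale sequence: a dark, noisy static
# background, with a bright blob (a stand-in for a person) entering
# in the later frames. All values are made up for illustration.
rng = np.random.default_rng(0)
frames = [rng.normal(20.0, 2.0, (8, 8)) for _ in range(10)]
for f in frames[5:]:
    f[2:5, 2:5] += 80.0  # simulated moving subject

# Learn the background from the empty frames only.
bg = frames[0].copy()
for f in frames[1:5]:
    bg = update_background(bg, f)

# Detect the subject in the final frame.
mask = foreground_mask(bg, frames[-1])
print(mask.sum())  # number of foreground pixels (the 3x3 blob)
```

In the paper's setting, the background-subtractor parameters (here `alpha` and `threshold`) and the number of pre- and post-processing operations (e.g. blurring and morphological filtering of `mask`) are exactly the quantities the genetic algorithm tunes for each of the 30 candidate algorithms.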