SuperBE: computationally light background estimation with superpixels

Bibliographic Details
Published in: Journal of Real-Time Image Processing, Vol. 16, Issue 6, pp. 2319–2335
Authors: Chen, Andrew Tzer-Yeu; Biglari-Abhari, Morteza; Wang, Kevin I-Kai
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01.12.2019
ISSN: 1861-8200, 1861-8219
Abstract

This paper presents a motion-based superpixel-level background estimation algorithm that aims to be competitively accurate while requiring less computation time for background modelling and updating. Superpixels are chosen for their spatial and colour coherency, and can be grouped together to better define the shapes of objects in an image. RGB mean and colour covariance matrices are used as the discriminative features for comparing superpixels to their background model samples. The background model initialisation and update procedures are inspired by existing approaches, with the key aim of minimising computational complexity and therefore processing time. Experiments carried out with a widely used dataset show that SuperBE achieves a high level of accuracy and is competitive against other state-of-the-art background estimation algorithms. The main contribution of this paper is the computationally efficient use of superpixels in background estimation while maintaining high accuracy, reaching 135 fps on 320 × 240 resolution images.
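As a reading aid, the feature extraction named in the abstract (a per-superpixel RGB mean plus a 3×3 colour covariance matrix, compared against stored background samples) can be sketched in a few lines. This is a minimal illustration under assumed conventions, not the paper's implementation: the superpixel label map is taken as given (e.g. from a SLIC segmenter), and the matching rule and thresholds below are placeholders.

```python
# Minimal sketch of the per-superpixel features named in the abstract:
# an RGB mean and a 3x3 colour covariance matrix per superpixel.
# The label map is assumed to come from any superpixel segmenter
# (e.g. SLIC); the match test and thresholds are illustrative only.
import numpy as np

def superpixel_features(image, labels):
    """Return {label: (mean_rgb, covariance)} for each superpixel.

    image  : H x W x 3 array of RGB values
    labels : H x W integer array assigning each pixel to a superpixel
    """
    feats = {}
    for sp in np.unique(labels):
        pixels = image[labels == sp].astype(np.float64)  # N x 3
        mean = pixels.mean(axis=0)                       # per-channel mean
        cov = np.cov(pixels, rowvar=False)               # 3 x 3 covariance
        feats[sp] = (mean, cov)
    return feats

def matches_background(mean, cov, bg_samples,
                       mean_thresh=20.0, cov_thresh=50.0):
    """Placeholder match test: the superpixel agrees with the background
    model if it is close to at least one stored (mean, covariance) sample,
    using Euclidean and Frobenius distances respectively."""
    return any(np.linalg.norm(mean - m) < mean_thresh and
               np.linalg.norm(cov - c) < cov_thresh
               for m, c in bg_samples)
```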
DOI: 10.1007/s11554-018-0750-7