An Effective Subsuperpixel-Based Approach for Background Subtraction

Detailed Bibliography
Published in: IEEE Transactions on Industrial Electronics (1982), Volume 67, Issue 1, pp. 601-609
Main Authors: Chen, Yu-Qiu; Sun, Zhan-Li; Lam, Kin-Man
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2020
ISSN: 0278-0046, 1557-9948
Description
Summary: Achieving competitive accuracy and low computation time simultaneously for background estimation remains a challenging task. In this paper, an effective background subtraction approach for video sequences is proposed based on a subsuperpixel model. In our algorithm, the superpixels of the first frame are constructed using a simple linear iterative clustering method. After transforming the frame from color to gray level, the initial superpixels are divided into K smaller units, i.e., subsuperpixels, via the k-means clustering algorithm. The background model is then initialized by representing each subsuperpixel as a multidimensional feature vector. For the subsequent frames, moving objects are detected by the subsuperpixel representation and a weighting measure. In order to deal with ghost artifacts, a background-model updating strategy is devised, based on the number of pixels represented by each cluster center. As each superpixel is refined via the subsuperpixel representation, the proposed method is more efficient and achieves competitive accuracy for background subtraction. Experimental results demonstrate the effectiveness of the proposed method.
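
As a rough illustration of the pipeline described in the summary (SLIC superpixels on the first frame, gray-level conversion, k-means subdivision into K subsuperpixels, and per-cluster features with pixel counts for the later model update), a minimal initialization sketch in Python is given below. It assumes scikit-image and scikit-learn; the parameter values, the gray-level cluster centers used as features, and the helper name init_background_model are illustrative assumptions, not the authors' exact implementation.

# Minimal sketch of the subsuperpixel background-model initialization,
# assuming scikit-image (SLIC) and scikit-learn (k-means). Parameter values
# are placeholders, not the settings reported in the paper.
import numpy as np
from skimage.segmentation import slic
from skimage.color import rgb2gray
from sklearn.cluster import KMeans

N_SUPERPIXELS = 200   # number of SLIC superpixels on the first frame (assumed)
K_SUBCLUSTERS = 3     # K subsuperpixels per superpixel (assumed)

def init_background_model(first_frame_rgb):
    """Split each SLIC superpixel of the first frame into K subsuperpixels."""
    # Superpixels of the first frame via simple linear iterative clustering.
    labels = slic(first_frame_rgb, n_segments=N_SUPERPIXELS, compactness=10,
                  start_label=0)
    # Transform the frame from color to gray level.
    gray = rgb2gray(first_frame_rgb)

    model = {}  # superpixel id -> list of (gray-level cluster center, pixel count)
    for sp in np.unique(labels):
        values = gray[labels == sp].reshape(-1, 1)
        k = min(K_SUBCLUSTERS, len(values))
        # Divide the superpixel into K smaller units (subsuperpixels) via k-means.
        km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(values)
        counts = np.bincount(km.labels_, minlength=k)
        # Keep the pixel count of each cluster center: the abstract bases the
        # ghost-handling model update on the number of pixels per cluster center.
        model[int(sp)] = list(zip(km.cluster_centers_.ravel(), counts))
    return labels, model

For subsequent frames, detection and the weighting measure would compare each frame's gray values against these per-subsuperpixel centers; those steps are not reproduced here, as the summary does not specify them in detail.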
DOI: 10.1109/TIE.2019.2893824