Moving Object Detection Algorithm based on Improved Visual Background Extractor



Detailed bibliography
Published in: Journal of Physics: Conference Series, Vol. 1732, No. 1, pp. 12078-12084
Main authors: Yang, Xuewen; Liu, Tangyou
Format: Journal Article
Language: English
Published: Bristol: IOP Publishing, 01.01.2021
ISSN: 1742-6588, 1742-6596
Description
Abstract: To address the problems that the ViBe algorithm produces false detections under camera jitter, generates ghosts when processing moving targets, and detects moving targets incompletely, an improved ViBe algorithm based on motion compensation is proposed. In the background modeling stage, a KLT-based motion compensation method is used to build the background model, enhancing the algorithm's robustness to dynamic backgrounds. In the foreground detection stage, the background model of the current pixel is combined with the background models of its 8 neighboring pixels, and a double-discriminant algorithm eliminates the ghost areas caused by classifying real background points as foreground points; the OTSU algorithm is introduced to obtain the optimal threshold for foreground detection. In the post-processing stage, the connected components of the segmentation mask are filtered and the models are updated to suppress wrong background points scattered in the foreground. Finally, the connected domains are repaired by a morphological opening operation. The results show that the algorithm effectively eliminates the interference caused by camera jitter and suppresses the ghost phenomenon, yielding more accurate foreground images.
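The abstract combines a ViBe-style per-pixel sample model with an adaptively chosen detection threshold. The following is a minimal NumPy sketch of two of those building blocks, the OTSU optimal-threshold computation and the ViBe background/foreground decision; the function names and parameters (`bins`, `min_matches`) are illustrative assumptions, not the paper's actual implementation, and the KLT motion-compensation and neighbor-model steps are omitted.

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Otsu's method: pick the threshold that maximizes the
    between-class variance of a 1-D value distribution. The paper
    uses this to choose the foreground-detection threshold
    adaptively instead of ViBe's fixed radius."""
    hist, edges = np.histogram(values, bins=bins)
    hist = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0
    best_t, best_var = centers[0], -1.0
    for i in range(1, bins):
        w0, w1 = hist[:i].sum(), hist[i:].sum()   # class weights
        if w0 == 0.0 or w1 == 0.0:
            continue
        mu0 = (hist[:i] * centers[:i]).sum() / w0  # class means
        mu1 = (hist[i:] * centers[i:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, centers[i]
    return best_t

def vibe_classify(pixel, samples, radius, min_matches=2):
    """ViBe decision rule: a pixel is background if at least
    `min_matches` of its stored model samples lie within `radius`
    of the current value; otherwise it is foreground."""
    matches = np.count_nonzero(np.abs(samples - pixel) < radius)
    return matches >= min_matches   # True = background
```

In standard ViBe the radius is a fixed constant (typically 20); replacing it with `otsu_threshold` applied to the per-pixel distance map is what lets the detection adapt to scene contrast, as the abstract describes.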
DOI: 10.1088/1742-6596/1732/1/012078