A modified inter-frame difference method for detection of moving objects in videos


Detailed bibliography
Published in: International Journal of Information Technology (Singapore. Online), Volume 17, Issue 2, pp. 749-754
Main authors: Srinivas, Yara; Ganivada, Avatharam
Format: Journal Article
Language: English
Published: Singapore: Springer Nature Singapore, 01.03.2025
Springer Nature B.V.
ISSN: 2511-2104, 2511-2112
Description
Abstract: Many methods based on frame difference have been developed for detecting objects in a video. However, selecting the moving pixels relevant to an object and detecting the object under different environmental conditions remains a challenging task. In this work, a modified inter-frame difference (MIFD) method is proposed for detecting moving objects in a video under various conditions. The method constructs a motion feature matrix with enhanced intensity values, called a motion frame. These intensity values are obtained by multiplication with a constant parameter (β), whose value differs for different video sequences. The motion feature matrix of the proposed model enhances the relevance of pixels associated with an object, which leads to more accurate detection. Otsu's threshold method is used in the detection process. We experimentally examine the performance of the proposed model on benchmark datasets, including changedetection.net (the CDNet2014 dataset). The superior performance of the proposed model compared to state-of-the-art methods is demonstrated on the CDNet2014 dataset.
DOI: 10.1007/s41870-024-02355-2
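
As an illustration of the approach described in the abstract above, the following is a minimal sketch of inter-frame differencing with a constant enhancement parameter β followed by Otsu thresholding, using OpenCV and NumPy. The abstract does not give the exact construction of the paper's motion feature matrix, so the function detect_moving_pixels, the default beta value, and the input path input_video.mp4 are illustrative assumptions rather than the authors' implementation.

```python
# Illustrative sketch only: generic inter-frame differencing with a constant
# scaling parameter beta and Otsu thresholding. Requires opencv-python and numpy.
import cv2
import numpy as np

def detect_moving_pixels(prev_frame, curr_frame, beta=1.5):
    """Return a binary mask of moving pixels between two consecutive BGR frames."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)

    # Absolute inter-frame difference.
    diff = cv2.absdiff(curr_gray, prev_gray)

    # Enhance motion-related intensities with the constant parameter beta
    # (a stand-in for the paper's motion feature matrix / motion frame).
    motion_frame = np.clip(diff.astype(np.float32) * beta, 0, 255).astype(np.uint8)

    # Otsu's method selects the threshold separating moving from static pixels.
    _, mask = cv2.threshold(motion_frame, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask

cap = cv2.VideoCapture("input_video.mp4")  # hypothetical input path
ok, prev = cap.read()
while ok:
    ok, curr = cap.read()
    if not ok:
        break
    mask = detect_moving_pixels(prev, curr)
    prev = curr
cap.release()
```

Otsu's method is used here because the abstract names it as the thresholding step; β is kept as a free parameter because the abstract states its value differs across video sequences.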