Fire detection using surveillance systems

Published in: Advances in Computing and Engineering, Volume 4, Issue 1, pp. 44–51
Main author: Mahmoud, Hanan Samir
Format: Journal Article
Language: English
Published: Academy Publishing Center, 23 February 2024
ISSN: 2735-5977, 2735-5985
Description
Summary: In this research, I present a video-based system that detects fire in real time by taking advantage of already existing surveillance systems, either inside or outside a building. Fire detection with surveillance cameras is characterized by early detection and rapid response, and information about the progress of a fire can be obtained through live video. A vision-based approach is also capable of providing forensic evidence. The basic idea of the research is video-based fire detection: I propose Fourier descriptors to describe reddish moving objects. The proposed system detects reddish moving bodies in every frame and correlates the detections with the same reddish bodies over time. Multi-threshold segmentation is used to divide the image; thresholding is one of the most common ways to segment an image, and this method can be integrated with preprocessing and post-processing. The stage after segmentation extracts the reddish body's features: the feature is created by obtaining the contour of the reddish body and estimating its normalized Fourier descriptors. If the Fourier descriptors of the reddish body's contour vary from frame to frame, then we can predict fire. Received: 18 December 2023 Accepted: 06 February 2024 Published: 23 February 2024
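The first stage the summary describes, segmenting reddish pixels by thresholding, can be sketched as follows. This is a minimal illustration, not the paper's method: the channel thresholds `r_min` and `ratio` are hypothetical values chosen for the example.

```python
import numpy as np

def reddish_mask(frame, r_min=120, ratio=1.4):
    """Boolean mask of 'reddish' pixels in an RGB frame (H, W, 3), uint8.

    Hypothetical rule: a pixel is reddish when its red channel is bright
    enough (>= r_min) and dominates green and blue by the factor `ratio`.
    """
    f = frame.astype(np.float32)
    r, g, b = f[..., 0], f[..., 1], f[..., 2]
    return (r >= r_min) & (r >= ratio * g) & (r >= ratio * b)

# Tiny synthetic frame: one flame-colored pixel, one gray background pixel.
frame = np.zeros((2, 2, 3), dtype=np.uint8)
frame[0, 0] = (200, 80, 40)    # flame-like: red dominates
frame[1, 1] = (100, 100, 100)  # neutral gray
mask = reddish_mask(frame)
print(mask[0, 0], mask[1, 1])  # True False
```

In a full pipeline the mask would feed a connected-component or contour-extraction step; combining several such thresholds yields the multi-threshold segmentation the summary mentions.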
DOI: 10.21622/ACE.2024.04.1.774
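The contour-descriptor step from the summary can be sketched with NumPy's FFT: treat the closed contour as a complex signal, drop the DC term for translation invariance, and divide by the first coefficient's magnitude for scale invariance. The coefficient count and the change measure below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def fourier_descriptors(contour, n_coeffs=8):
    """Normalized Fourier descriptor magnitudes of a closed 2-D contour.

    contour: (N, 2) array of (x, y) boundary points.
    Dropping the DC term removes translation; dividing by the first
    retained coefficient's magnitude removes scale.
    """
    z = contour[:, 0] + 1j * contour[:, 1]   # complex boundary signal
    F = np.fft.fft(z)
    F = F[1:n_coeffs + 1]                    # drop DC term
    return np.abs(F) / np.abs(F[0])          # scale-normalized magnitudes

def descriptor_change(desc_prev, desc_curr):
    """Euclidean distance between descriptor vectors of successive frames."""
    return float(np.linalg.norm(desc_curr - desc_prev))

# Example: a rigid contour (circle) vs. a deformed, flickering version,
# standing in for a reddish body whose outline changes between frames.
t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
circle = np.stack([np.cos(t), np.sin(t)], axis=1)
flicker = np.stack([np.cos(t) * (1 + 0.3 * np.sin(5 * t)),
                    np.sin(t)], axis=1)

d0 = fourier_descriptors(circle)
d1 = fourier_descriptors(flicker)
print(descriptor_change(d0, d0))  # zero for an unchanged contour
print(descriptor_change(d0, d1))  # larger when the contour deforms
```

A frame-to-frame distance that stays near zero suggests a static reddish object (e.g. a red sign), while a persistently large distance is consistent with the flickering contour of a flame, which is the decision rule the summary outlines.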