An Efficient Deep Learning Algorithm for Fire and Smoke Detection with Limited Data



Detailed Bibliography
Published in: Advances in Electrical and Computer Engineering, Vol. 18, No. 4, pp. 121-128
Main Authors: NAMOZOV, A., CHO, Y. I.
Format: Journal Article
Language: English
Published: Suceava: Stefan cel Mare University of Suceava, 01.11.2018
ISSN: 1582-7445, 1844-7600
Description
Summary: Detecting smoke and fire in visual scenes is a demanding task due to the high variance of their color and texture. A number of smoke and fire image classification approaches have been proposed to overcome this problem; however, most of them rely on either rule-based methods or handcrafted features. We propose a novel deep convolutional neural network algorithm to achieve high-accuracy fire and smoke image detection. Instead of traditional rectified linear units or tangent functions, we use adaptive piecewise linear units in the hidden layers of the network. We have also created a new small dataset of fire and smoke images to train and evaluate our model. To solve the overfitting problem caused by training the network on a limited dataset, we increase the number of available training images using traditional data augmentation techniques and generative adversarial networks. Experimental results show that the proposed approach achieves high accuracy and a high detection rate, as well as a very low rate of false alarms.

Index Terms: smoke detectors, neural networks, image classification, image recognition, image generation.
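The abstract's key architectural change is replacing ReLU/tanh activations with adaptive piecewise linear (APL) units. As a minimal sketch, the standard APL formulation is h(x) = max(0, x) + Σ_s a_s · max(0, −x + b_s), where each hinge has a learnable slope a_s and offset b_s. The NumPy snippet below illustrates that formula only; the parameter values are hypothetical, not taken from the paper, and in the actual network they would be learned during training.

```python
import numpy as np

def apl_unit(x, a, b):
    """Adaptive piecewise linear activation:
    h(x) = max(0, x) + sum_s a[s] * max(0, -x + b[s]).
    a, b: per-hinge slope and offset parameters (learnable in practice)."""
    x = np.asarray(x, dtype=float)
    out = np.maximum(0.0, x)          # the ReLU-like base term
    for a_s, b_s in zip(a, b):
        out = out + a_s * np.maximum(0.0, -x + b_s)  # one hinge per (a_s, b_s)
    return out

# With one hinge (a=0.5, b=1.0): positive inputs behave like ReLU,
# while sufficiently negative inputs get a nonzero, adaptable response.
print(apl_unit(2.0, [0.5], [1.0]))   # -> 2.0
print(apl_unit(-1.0, [0.5], [1.0]))  # -> 1.0
```

Unlike a plain ReLU, the hinge terms let each unit learn a nonzero response on the negative side, which is the flexibility the authors exploit in the hidden layers.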
DOI: 10.4316/AECE.2018.04015