An Efficient Deep Learning Algorithm for Fire and Smoke Detection with Limited Data


Detailed Description

Bibliographic Details
Published in: Advances in Electrical and Computer Engineering, Vol. 18, No. 4, pp. 121-128
Main Authors: NAMOZOV, A., CHO, Y. I.
Format: Journal Article
Language: English
Published: Suceava: Stefan cel Mare University of Suceava, 01.11.2018
ISSN: 1582-7445, 1844-7600
Online Access: Full text
Description
Summary: Detecting smoke and fire in visual scenes is a demanding task due to the high variance of color and texture. A number of smoke and fire image classification approaches have been proposed to overcome this problem; however, most of them rely on either rule-based methods or handcrafted features. We propose a novel deep convolutional neural network algorithm to achieve high-accuracy fire and smoke image detection. Instead of traditional rectified linear units or tangent functions, we use adaptive piecewise linear units in the hidden layers of the network. We have also created a new small dataset of fire and smoke images to train and evaluate our model. To solve the overfitting problem caused by training the network on a limited dataset, we increase the number of available training images using traditional data augmentation techniques and generative adversarial networks. Experimental results show that the proposed approach achieves high accuracy and a high detection rate, as well as a very low rate of false alarms.
Index Terms: smoke detectors, neural networks, image classification, image recognition, image generation.
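The adaptive piecewise linear (APL) activation the abstract refers to is commonly defined as f(x) = max(0, x) + Σ_s a_s · max(0, −x + b_s), where a_s and b_s are learnable per-unit parameters. Below is a minimal NumPy sketch of that functional form; the parameter values are illustrative only, not the ones learned in the paper, and the function name is a hypothetical helper.

```python
import numpy as np

def apl_unit(x, a, b):
    """Adaptive piecewise linear activation:
    f(x) = max(0, x) + sum_s a[s] * max(0, -x + b[s]).
    In a network, a and b would be learned per hidden unit;
    here they are fixed for illustration."""
    x = np.asarray(x, dtype=float)
    out = np.maximum(0.0, x)  # the standard ReLU term
    for a_s, b_s in zip(a, b):
        # each extra hinge adds a linear segment left of x = b_s
        out = out + a_s * np.maximum(0.0, -x + b_s)
    return out

# Two hinge segments with illustrative slopes and offsets
a = [0.2, -0.1]
b = [0.0, 1.0]
values = apl_unit(np.array([-2.0, 0.0, 2.0]), a, b)
```

Because each hinge contributes its own learnable slope, the unit can approximate non-convex activation shapes that a fixed ReLU or tanh cannot, which is the motivation the abstract gives for replacing them.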
DOI:10.4316/AECE.2018.04015