Deep-anomaly: Fully convolutional neural network for fast anomaly detection in crowded scenes



Bibliographic Details
Published in: Computer Vision and Image Understanding, Vol. 172, pp. 88-97
Main Authors: Sabokrou, Mohammad; Fayyaz, Mohsen; Fathy, Mahmood; Moayed, Zahra; Klette, Reinhard
Format: Journal Article
Language: English
Published: Elsevier Inc., 01.07.2018
ISSN: 1077-3142, 1090-235X
Online Access: Full Text
Description
Summary:
• This paper is one of the first in which a fully convolutional neural network is used for anomaly detection.
• A pre-trained classification CNN is adapted to an FCN for generating video regions that describe motion and shape concurrently.
• A new FCN architecture is proposed for time-efficient anomaly detection and localization.
• The proposed method performs as well as state-of-the-art methods but outperforms them in runtime, ensuring real-time performance for typical applications.
The detection of abnormal behaviour in crowded scenes has to deal with many challenges. This paper presents an efficient method for the detection and localization of anomalies in videos. Using fully convolutional neural networks (FCNs) and temporal data, a pre-trained supervised FCN is transferred into an unsupervised FCN, ensuring the detection of (global) anomalies in scenes. High performance in terms of speed and accuracy is achieved through cascaded detection, which reduces computational complexity. This FCN-based architecture addresses two main tasks: feature representation and cascaded outlier detection. Experimental results on two benchmarks suggest that the proposed method outperforms existing methods in terms of detection and localization accuracy.
DOI: 10.1016/j.cviu.2018.02.006
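The abstract describes reusing a pre-trained classification CNN as a fully convolutional feature extractor, so each frame yields a spatial grid of regional descriptors that can be scored for abnormality. The sketch below illustrates that general idea only; the backbone choice (early AlexNet layers), the simple Gaussian scoring model, the layer cut-off, and all function names are assumptions for illustration, not the paper's exact architecture or its cascaded outlier classifier.

```python
# Minimal sketch (not the paper's exact pipeline): truncate a pre-trained
# classification CNN into a fully convolutional feature extractor and score
# each receptive-field region of a frame against a Gaussian model of
# "normal" features. Backbone, layer cut-off, and scoring are assumptions.
import torch
import torchvision

# Early convolutional layers of AlexNet; fully convolutional, so frames of
# arbitrary size are accepted and produce a spatial grid of descriptors.
backbone = torchvision.models.alexnet(
    weights=torchvision.models.AlexNet_Weights.DEFAULT
).features[:6]
backbone.eval()

@torch.no_grad()
def regional_features(frame: torch.Tensor) -> torch.Tensor:
    """frame: (1, 3, H, W) float tensor -> (N, C) matrix, one row per region."""
    fmap = backbone(frame)                                  # (1, C, h, w)
    return fmap.permute(0, 2, 3, 1).reshape(-1, fmap.shape[1])

def fit_normal_model(normal_frames):
    """Estimate per-channel mean/std of features from anomaly-free frames."""
    feats = torch.cat([regional_features(f) for f in normal_frames])
    return feats.mean(dim=0), feats.std(dim=0) + 1e-6

def anomaly_scores(frame, mean, std):
    """Return one score per region; large values indicate unusual regions."""
    z = (regional_features(frame) - mean) / std
    return z.norm(dim=1)

# Hypothetical usage: fit on normal training frames, then flag test regions
# whose score exceeds a chosen threshold.
# mean, std = fit_normal_model(train_frames)
# suspicious = anomaly_scores(test_frame, mean, std) > threshold
```

Because the truncated network contains no fully connected layers, each output cell corresponds to a local image region, which is what makes per-region (localized) anomaly scoring possible in a single forward pass.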