Scalable Online Convolutional Sparse Coding


Full Description

Saved in:
Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 27, No. 10, pp. 4850-4859
Main Authors: Wang, Yaqing; Yao, Quanming; Kwok, James T.; Ni, Lionel M.
Format: Journal Article
Language: English
Published: United States: IEEE, 01.10.2018
ISSN: 1057-7149, 1941-0042
Online Access: Full Text
Description
Summary: Convolutional sparse coding (CSC) improves sparse coding by learning a shift-invariant dictionary from the data. However, most existing CSC algorithms operate in batch mode and are computationally expensive. In this paper, we alleviate this problem by online learning. The key is a reformulation of the CSC objective so that convolution can be handled easily in the frequency domain and much smaller history matrices are needed. To solve the resultant optimization problem, we use the alternating direction method of multipliers (ADMM), whose subproblems have efficient closed-form solutions. Theoretical analysis shows that the learned dictionary converges to a stationary point of the optimization problem. Extensive experiments are performed on both standard CSC benchmark data sets and much larger data sets such as ImageNet. Results show that the proposed algorithm outperforms state-of-the-art batch and online CSC methods: it is more scalable, converges faster, and achieves better reconstruction performance.
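The frequency-domain reformulation in the summary rests on the convolution theorem: circular convolution of a dictionary filter with a code map becomes elementwise multiplication of their Fourier transforms, so each frequency decouples and the solver's subproblems become cheap independent updates. The sketch below illustrates only this identity, not the authors' algorithm; the filter size, signal length, and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64                        # signal length (assumed for illustration)
d = rng.standard_normal(8)    # a small dictionary filter
z = rng.standard_normal(n)    # dense stand-in for a sparse code map

# Zero-pad the filter to the signal length so DFT sizes match.
d_pad = np.zeros(n)
d_pad[:d.size] = d

# Convolution via the frequency domain: elementwise product of FFTs.
conv_fft = np.real(np.fft.ifft(np.fft.fft(d_pad) * np.fft.fft(z)))

# Direct circular convolution in the spatial domain, for comparison.
conv_direct = np.array([sum(d_pad[k] * z[(i - k) % n] for k in range(n))
                        for i in range(n)])

# The two agree, which is why CSC objectives can be optimized per frequency.
assert np.allclose(conv_fft, conv_direct)
```
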
DOI:10.1109/TIP.2018.2842152