Efficient Approximate Online Convolutional Dictionary Learning

Published in: IEEE Transactions on Computational Imaging, Volume 9, pp. 1165-1175
Main authors: Veshki, Farshad G.; Vorobyov, Sergiy A.
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2023
ISSN: 2573-0436, 2333-9403
Summary: Most existing convolutional dictionary learning (CDL) algorithms are based on batch learning, where the dictionary filters and the convolutional sparse representations are optimized in an alternating manner using a training dataset. When large training datasets are used, batch CDL algorithms become prohibitively memory-intensive. An online-learning technique is used to reduce the memory requirements of CDL by optimizing the dictionary incrementally after finding the sparse representations of each training sample. Nevertheless, learning large dictionaries using the existing online CDL (OCDL) algorithms remains highly computationally expensive. In this paper, we present a novel approximate OCDL method that incorporates sparse decomposition of the training samples. The resulting optimization problems are addressed using the alternating direction method of multipliers. Extensive experimental evaluations using several image datasets and based on an image fusion task show that the proposed method substantially reduces computational costs while preserving the effectiveness of the state-of-the-art OCDL algorithms.
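To make the workflow the abstract describes concrete, the following is a minimal Python sketch of a generic online CDL loop: each incoming training sample is sparse-coded, and the filters are then updated incrementally from that single sample. This is not the authors' algorithm; the paper solves both subproblems with ADMM and additionally exploits a sparse decomposition of the training samples, whereas the sketch uses ISTA for the coding step and a single projected gradient step for the dictionary. The 1-D setting, filter count, step sizes, and sparsity weight are all illustrative assumptions.

# Illustrative sketch only (NOT the paper's algorithm): a toy online CDL
# loop that alternates ISTA-based convolutional sparse coding with a
# projected-gradient filter update; all sizes and parameters are assumed.
import numpy as np

def conv_sum(D, Z):
    """Reconstruction sum_m d_m * z_m (circular convolution via the FFT)."""
    n = Z.shape[1]
    Df = np.fft.rfft(D, n, axis=1)
    Zf = np.fft.rfft(Z, n, axis=1)
    return np.fft.irfft((Df * Zf).sum(axis=0), n)

def sparse_code(x, D, lam=0.1, iters=100):
    """ISTA for min_Z 0.5*||x - sum_m d_m * z_m||^2 + lam*||Z||_1."""
    n = x.size
    Df = np.fft.rfft(D, n, axis=1)
    step = 1.0 / (np.abs(Df) ** 2).sum(axis=0).max()  # Lipschitz bound
    Z = np.zeros((D.shape[0], n))
    for _ in range(iters):
        r = conv_sum(D, Z) - x                                  # residual
        g = np.fft.irfft(np.conj(Df) * np.fft.rfft(r, n), n)    # gradient
        Z = Z - step * g
        Z = np.sign(Z) * np.maximum(np.abs(Z) - step * lam, 0.0)  # shrink
    return Z

def dict_step(D, x, Z):
    """One online projected-gradient update of the filters for sample x."""
    k, n = D.shape[1], x.size
    Zf = np.fft.rfft(Z, n, axis=1)
    r = conv_sum(D, Z) - x
    g = np.fft.irfft(np.conj(Zf) * np.fft.rfft(r, n), n)  # grad, padded filters
    step = 1.0 / max((np.abs(Zf) ** 2).sum(axis=0).max(), 1e-8)
    D = D - step * g[:, :k]                 # keep the length-k filter support
    norms = np.maximum(np.linalg.norm(D, axis=1, keepdims=True), 1.0)
    return D / norms                        # project onto the unit ball

# Toy usage: stream placeholder "samples" one at a time (the online aspect);
# a real run would feed signals or image patches from a training set.
rng = np.random.default_rng(0)
M, k, n = 8, 16, 256
D = rng.standard_normal((M, k))
D /= np.linalg.norm(D, axis=1, keepdims=True)
for _ in range(50):
    x = rng.standard_normal(n)              # placeholder training sample
    Z = sparse_code(x, D)
    D = dict_step(D, x, Z)

Existing OCDL algorithms typically replace the single gradient step above with a dictionary subproblem built from aggregated statistics of all past sparse codes; that accumulation is what keeps memory usage independent of the dataset size, and its cost for large dictionaries is what the paper's approximation targets.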
DOI: 10.1109/TCI.2023.3340612