Image Restoration via Reconciliation of Group Sparsity and Low-Rank Models

Detailed bibliography
Published in: IEEE Transactions on Image Processing, Vol. 30, pp. 5223-5238
Main authors: Zha, Zhiyuan; Wen, Bihan; Yuan, Xin; Zhou, Jiantao; Zhu, Ce
Format: Journal Article
Language: English
Publication details: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2021
ISSN: 1057-7149, 1941-0042

Description
Summary: The image nonlocal self-similarity (NSS) property has been widely exploited via various sparsity models such as joint sparsity (JS) and group sparse coding (GSC). However, the existing NSS-based sparsity models are either too restrictive, e.g., JS enforces the sparse codes to share the same support, or too general, e.g., GSC imposes only plain sparsity on the group coefficients, which limits their effectiveness for modeling real images. In this paper, we propose a novel NSS-based sparsity model, namely, low-rank regularized group sparse coding (LR-GSC), to bridge the gap between the popular GSC and JS. The proposed LR-GSC model simultaneously exploits the sparsity and low-rankness of the dictionary-domain coefficients for each group of similar patches. An alternating minimization algorithm with an adaptively adjusted parameter strategy is developed to solve the proposed optimization problem for different image restoration tasks, including image denoising, image deblocking, image inpainting, and image compressive sensing. Extensive experimental results demonstrate that the proposed LR-GSC algorithm outperforms many popular or state-of-the-art methods in terms of objective and perceptual metrics.
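
The sketch below is a minimal illustration (not the authors' implementation) of the kind of alternating minimization the abstract describes: for one group of similar patches, a sparsity (L1) penalty and a low-rank (nuclear-norm) penalty are both imposed on the dictionary-domain coefficients, and the two closed-form proximal updates are alternated. The orthonormal dictionary, the splitting variable A, and all parameter values are assumptions made only for this demo.

```python
import numpy as np

def soft_threshold(X, tau):
    """Element-wise soft-thresholding: proximal operator of the L1 (sparsity) penalty."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svt(X, tau):
    """Singular value thresholding: proximal operator of the nuclear-norm (low-rank) penalty."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def lr_gsc_group(Y, D, lam=0.1, rho=0.1, mu=1.0, n_iter=30):
    """Toy alternating minimization for one group of similar patches Y (one patch per column).

    Generic sparse + low-rank surrogate objective (not the paper's exact formulation):
        min_{B,A}  ||Y - D B||_F^2 + lam*||B||_1 + rho*||A||_* + mu*||B - A||_F^2
    D is assumed to be a square orthonormal dictionary, so the data term
    reduces to ||D^T Y - B||_F^2 and both subproblems have closed forms.
    """
    C = D.T @ Y                      # dictionary-domain coefficients of the group
    B = C.copy()
    A = C.copy()
    for _ in range(n_iter):
        # Sparse-code step: proximal update for the L1 term
        B = soft_threshold((C + mu * A) / (1.0 + mu), lam / (2.0 * (1.0 + mu)))
        # Low-rank step: proximal update for the nuclear norm
        A = svt(B, rho / (2.0 * mu))
    return D @ B                     # reconstructed group of patches

# Demo with a random orthonormal dictionary and a synthetic noisy patch group
rng = np.random.default_rng(0)
D, _ = np.linalg.qr(rng.standard_normal((64, 64)))   # stand-in for a learned/DCT dictionary
Y = rng.standard_normal((64, 40))                    # 40 vectorized 8x8 patches (placeholder)
X_hat = lr_gsc_group(Y, D)
```

In the paper, penalty weights are adjusted adaptively across iterations and restoration tasks; the fixed values above are placeholders for illustration only.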

DOI: 10.1109/TIP.2021.3078329