Jointly Optimized Regressors for Image Super-resolution

Bibliographic Details
Published in: Computer Graphics Forum, Vol. 34, Issue 2, pp. 95-104
Main authors: Dai, D., Timofte, R., Van Gool, L.
Format: Journal Article
Language: English
Published: Oxford: Blackwell Publishing Ltd, 01.05.2015
ISSN: 0167-7055, 1467-8659
Description
Summary: Learning regressors from low-resolution patches to high-resolution patches has shown promising results for image super-resolution. We observe that different regressors are better suited to different cases. In this paper, we jointly learn a collection of regressors which collectively yield the smallest super-resolving error over all training data. After training, each training sample is associated with a label indicating its 'best' regressor, the one yielding the smallest error. During testing, our method relies on the concept of 'adaptive selection' to choose the most appropriate regressor for each input patch. We assume that similar patches can be super-resolved by the same regressor and use a fast, approximate kNN approach to transfer the labels of training patches to test patches. The method is conceptually simple and computationally efficient, yet very effective. Experiments on four datasets show that our method outperforms competing methods.
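Below is a minimal, illustrative sketch of the 'adaptive selection' step described in the abstract, assuming a set of linear patch regressors has already been jointly trained and each training patch carries the label of its best regressor. The names (select_and_apply, train_feats, regressors, etc.) are our own placeholders, not the authors' code, and a brute-force kNN stands in for the fast approximate kNN used in the paper.

```python
# Illustrative sketch only: adaptive selection of a per-patch regressor.
# Assumes linear regressors (matrices mapping LR patch features to HR patches)
# were already jointly trained, and train_labels[i] names the regressor that
# super-resolves training patch i with the smallest error.
import numpy as np

def select_and_apply(test_feats, train_feats, train_labels, regressors, k=5):
    """Transfer regressor labels from training to test patches via kNN
    (brute force here; the paper uses a fast, approximate kNN), then apply
    the selected regressor to each test patch feature vector."""
    train_labels = np.asarray(train_labels)
    outputs = []
    for x in test_feats:                               # one LR patch feature
        dists = np.linalg.norm(train_feats - x, axis=1)
        nn = np.argsort(dists)[:k]                     # k nearest training patches
        votes = np.bincount(train_labels[nn])          # vote over their labels
        r = int(np.argmax(votes))                      # chosen regressor index
        outputs.append(regressors[r] @ x)              # predicted HR patch
    return np.stack(outputs)
```

Whether labels are transferred by a majority vote over the k neighbours or by the single nearest neighbour is a design choice; the joint training of the regressors themselves is not shown here.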
Article ID: CGF12544
DOI: 10.1111/cgf.12544