Missing Texture Reconstruction Method Based on Perceptually Optimized Algorithm

Detailed Bibliography
Published in: EURASIP Journal on Advances in Signal Processing, Vol. 2010, No. 1, Article 208976
Main Authors: Ogawa, Takahiro; Haseyama, Miki
Format: Journal Article
Language: English
Published: Cham: Springer International Publishing, 01.01.2010
Springer Nature B.V.
SpringerOpen
ISSN: 1687-6180, 1687-6172
Description
Summary: This paper presents a simple and effective missing texture reconstruction method based on a perceptually optimized algorithm. The proposed method utilizes the structural similarity (SSIM) index as a new visual quality measure for reconstructing missing areas. Furthermore, in order to adaptively reconstruct target images containing several kinds of textures, two novel approaches are introduced into the SSIM-based reconstruction algorithm. First, the proposed method performs SSIM-based selection of the optimal known local textures to adaptively obtain subspaces for reconstructing missing textures. Second, missing texture reconstruction that maximizes the SSIM index in the known neighboring areas is performed. In this approach, the nonconvex maximization problem is reformulated as a quasiconvex problem, making adaptive reconstruction of the missing textures based on the perceptually optimized algorithm feasible. Experimental results show marked improvements of the proposed method over previously reported reconstruction methods.
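The SSIM index that the summary uses as its quality measure can be illustrated with a minimal single-window sketch of the standard index (Wang et al.). This is not the paper's reconstruction algorithm; the patch size, random data, and stability constants below are illustrative assumptions.

```python
import numpy as np

def ssim(x, y, L=255.0, k1=0.01, k2=0.03):
    """Single-window SSIM between two equally sized image patches.

    L is the dynamic range of the pixel values; c1 and c2 are the
    usual small constants that stabilize the ratio.
    """
    c1, c2 = (k1 * L) ** 2, (k2 * L) ** 2
    x = x.astype(np.float64).ravel()
    y = y.astype(np.float64).ravel()
    mx, my = x.mean(), y.mean()          # luminance terms
    vx, vy = x.var(), y.var()            # contrast terms
    cov = ((x - mx) * (y - my)).mean()   # structure term
    return ((2 * mx * my + c1) * (2 * cov + c2)) / (
        (mx ** 2 + my ** 2 + c1) * (vx + vy + c2)
    )

# Identical patches score exactly 1.0; a noise-corrupted copy scores lower,
# which is why maximizing SSIM over known neighboring pixels favors
# perceptually faithful reconstructions.
rng = np.random.default_rng(0)
patch = rng.integers(0, 256, size=(8, 8))
noisy = np.clip(patch + rng.normal(0, 25, size=(8, 8)), 0, 255)
print(ssim(patch, patch))  # 1.0
print(ssim(patch, noisy))
```

In the paper's setting, this index replaces the usual mean-squared-error criterion both when selecting the known local textures that define the reconstruction subspace and when optimizing the reconstructed values themselves.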
DOI: 10.1155/2010/208976