Texture Synthesis From Photographs



Bibliographic Details
Published in: Computer Graphics Forum, Vol. 27, no. 2, pp. 419-428
Main Authors: Eisenacher, C., Lefebvre, S., Stamminger, M.
Format: Journal Article
Language:English
Published: Oxford, UK: Blackwell Publishing Ltd, 01.04.2008
ISSN: 0167-7055, 1467-8659
Description
Summary: The goal of texture synthesis is to generate an arbitrarily large high‐quality texture from a small input sample. Generally, it is assumed that the input image is given as a flat, square piece of texture, thus it has to be carefully prepared from a picture taken under ideal conditions. Instead we would like to extract the input texture from any surface from within an arbitrary photograph. This introduces several challenges: Only parts of the photograph are covered with the texture of interest, perspective and scene geometry introduce distortions, and the texture is non‐uniformly sampled during the capture process. This breaks many of the assumptions used for synthesis. In this paper we combine a simple novel user interface with a generic per‐pixel synthesis algorithm to achieve high‐quality synthesis from a photograph. Our interface lets the user locally describe the geometry supporting the textures by combining rational Bézier patches. These are particularly well suited to describe curved surfaces under projection. Further, we extend per‐pixel synthesis to account for arbitrary texture sparsity and distortion, both in the input image and in the synthesis output. Applications range from synthesizing textures directly from photographs to high‐quality texture completion.
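The rational Bézier patches mentioned in the summary are a standard surface primitive; their per-control-point weights behave like homogeneous coordinates, which is why they can represent curved surfaces seen under perspective projection. As a rough, self-contained sketch of the underlying math (not the paper's implementation — the function names, degrees, and data layout here are illustrative assumptions), a patch point is the weighted Bernstein blend of a control grid, normalized by the summed weights:

```python
from math import comb

def bernstein(n, i, t):
    """Bernstein basis polynomial B_i^n(t) = C(n, i) * t^i * (1 - t)^(n - i)."""
    return comb(n, i) * t**i * (1 - t)**(n - i)

def rational_bezier_patch(ctrl, weights, u, v):
    """Evaluate a rational Bezier patch at parameters (u, v) in [0, 1]^2.

    ctrl    : (n+1) x (m+1) grid of 2-D control points (tuples).
    weights : matching grid of positive scalar weights.
    Returns the patch point as a tuple (x, y); with all weights equal
    to 1 this reduces to an ordinary (polynomial) Bezier patch.
    """
    n = len(ctrl) - 1        # degree in u
    m = len(ctrl[0]) - 1     # degree in v
    num_x = num_y = denom = 0.0
    for i in range(n + 1):
        bu = bernstein(n, i, u)
        for j in range(m + 1):
            b = bu * bernstein(m, j, v) * weights[i][j]
            num_x += b * ctrl[i][j][0]
            num_y += b * ctrl[i][j][1]
            denom += b
    # Projective normalization: dividing by the weight sum is what lets
    # rational patches model perspective-distorted (projected) surfaces.
    return (num_x / denom, num_y / denom)

# Usage: a bilinear (degree 1x1) patch over the unit square with unit weights.
ctrl = [[(0.0, 0.0), (0.0, 1.0)],
        [(1.0, 0.0), (1.0, 1.0)]]
w = [[1.0, 1.0], [1.0, 1.0]]
center = rational_bezier_patch(ctrl, w, 0.5, 0.5)  # -> (0.5, 0.5)
```

Raising a weight pulls the surface toward that control point nonlinearly, which is how such patches can match the foreshortening of a curved surface in a photograph.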
Bibliography: ArticleID:CGF1139
ark:/67375/WNG-L3MRNTWJ-G
istex:C0516808293AE1184A22C002A29FACD5D50F31F2
DOI: 10.1111/j.1467-8659.2008.01139.x