Deep Detail Enhancement for Any Garment.

Saved in:
Bibliographic Details
Title: Deep Detail Enhancement for Any Garment.
Authors: Zhang, Meng; Wang, Tuanfeng; Ceylan, Duygu; Mitra, Niloy J.
Source: Computer Graphics Forum; May 2021, Vol. 40, Issue 2, p399-411, 13p, 9 Color Photographs, 2 Diagrams, 2 Charts, 2 Graphs
Keywords: SEWING patterns, PARAMETERIZATION, DEEP learning, SCANNING systems, PRODUCTION methods
Abstract: Creating fine garment details requires significant effort and huge computational resources. In contrast, a coarse shape may be easy to acquire in many scenarios (e.g., via low-resolution physically-based simulation, linear blend skinning driven by skeletal motion, or portable scanners). In this paper, we show how to enhance a coarse garment geometry, in a data-driven manner, with rich yet plausible details. Once the parameterization of the garment is given, we formulate the task as a style transfer problem over the space of associated normal maps. To facilitate generalization across garment types and character motions, we introduce a patch-based formulation that hallucinates geometric details (i.e., wrinkle density and shape), producing high-resolution detail by matching a Gram-matrix-based style loss. We extensively evaluate our method on a variety of production scenarios and show that it is simple, lightweight, efficient, and generalizes across underlying garment types, sewing patterns, and body motion. Project page: http://geometry.cs.ucl.ac.uk/projects/2021/DeepDetailEnhance/ [ABSTRACT FROM AUTHOR]
Copyright of Computer Graphics Forum is the property of Wiley-Blackwell and its content may not be copied or emailed to multiple sites without the copyright holder's express written permission. Additionally, content may not be used with any artificial intelligence tools or machine learning technologies. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
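Note: The abstract's style-transfer formulation centers on matching Gram matrices of deep features between normal-map patches. The following is a minimal illustrative sketch of such a Gram-matrix style loss, not the authors' implementation; it assumes PyTorch, torchvision's pretrained VGG-16 as the feature extractor, and a hypothetical 128x128 patch size, none of which are specified in this record.

```python
# Illustrative sketch: Gram-matrix style loss between normal-map patches.
# NOT the paper's implementation; feature extractor and patch size are assumptions.
import torch
import torch.nn.functional as F
from torchvision.models import vgg16

def gram_matrix(features: torch.Tensor) -> torch.Tensor:
    """Channel-wise Gram matrix of a feature map (B, C, H, W) -> (B, C, C)."""
    b, c, h, w = features.shape
    flat = features.view(b, c, h * w)
    return flat @ flat.transpose(1, 2) / (c * h * w)

def style_loss(coarse_patch: torch.Tensor,
               detailed_patch: torch.Tensor,
               feature_net: torch.nn.Module) -> torch.Tensor:
    """Compare Gram matrices of deep features for a coarse normal-map patch
    and a reference high-resolution patch."""
    f_coarse = feature_net(coarse_patch)
    f_detail = feature_net(detailed_patch)
    return F.mse_loss(gram_matrix(f_coarse), gram_matrix(f_detail))

if __name__ == "__main__":
    # Hypothetical 128x128 normal-map patches encoded as 3 channels in [0, 1].
    coarse = torch.rand(1, 3, 128, 128)
    detailed = torch.rand(1, 3, 128, 128)
    # Early VGG-16 convolutional block as the feature extractor (illustrative choice).
    net = vgg16(weights="IMAGENET1K_V1").features[:9].eval()
    with torch.no_grad():
        print(style_loss(coarse, detailed, net).item())
```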
Database: Complementary Index
ISSN: 0167-7055
DOI: 10.1111/cgf.142642