A Domain-Guided Noise-Optimization-Based Inversion Method for Facial Image Manipulation

Detailed Bibliography
Published in: IEEE Transactions on Image Processing, Volume 30, pp. 6198-6211
Main Authors: Yang, Nan; Zheng, Zeyu; Zhou, Mengchu; Guo, Xiwang; Qi, Liang; Wang, Tianran
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2021
ISSN: 1057-7149, 1941-0042
Description
Summary: A style-based architecture (StyleGAN2) yields outstanding results in data-driven unconditional generative image modeling. This work proposes a Domain-guided Noise-optimization-based Inversion (DNI) method to perform facial image manipulation. It works based on an inverse code that includes: 1) a novel domain-guided encoder called Image2latent to project the image to StyleGAN2 latent space, which can reconstruct an input image with high quality and maintain its semantic meaning well; 2) a noise optimization mechanism in which a set of noise vectors is used to capture high-frequency details such as image edges, further improving image reconstruction quality; and 3) a mask for seamless image fusion and local style migration. We further propose a novel semantic alignment evaluation pipeline, which evaluates the semantic alignment of an inverse code by using different attribute boundaries. Extensive qualitative and quantitative comparisons show that DNI can capture rich semantic information and achieve satisfactory image reconstruction. It can realize a variety of facial image manipulation tasks and outperforms the state of the art.
DOI: 10.1109/TIP.2021.3089905