Human Pose Transfer by Adaptive Hierarchical Deformation


Detailed Bibliography
Published in: Computer Graphics Forum, Volume 39, Issue 7, pp. 325–337
Main Authors: Zhang, Jinsong; Liu, Xingzi; Li, Kun
Medium: Journal Article
Language: English
Published: Oxford: Blackwell Publishing Ltd, 01.10.2020
ISSN: 0167-7055, 1467-8659
Description
Summary: Human pose transfer, as a misaligned image generation task, is very challenging. Existing methods cannot effectively utilize the input information and often fail to preserve the style and shape of hair and clothes. In this paper, we propose an adaptive human pose transfer network with two hierarchical deformation levels. The first level generates a human semantic parsing aligned with the target pose, and the second level generates the final textured person image in the target pose under this semantic guidance. To avoid the drawback of vanilla convolution, which treats all pixels as valid information, we use gated convolution at both levels to dynamically select the important features and adaptively deform the image layer by layer. Our model has very few parameters and converges quickly. Experimental results demonstrate that our model achieves better performance than state-of-the-art methods, with more consistent hair, faces and clothes, while using fewer parameters. Furthermore, our method can be applied to clothing texture transfer. The code is available for research purposes at https://github.com/Zhangjinso/PINet_PG.
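To illustrate the gated convolution the abstract refers to, here is a minimal PyTorch sketch. It is not the authors' implementation (see the GitHub repository above for the actual code); the class name, channel sizes and LeakyReLU activation are assumptions. The idea is that each layer computes a feature branch and a sigmoid gate branch from the same input, so the network can softly down-weight uninformative pixels instead of treating every pixel as valid, as vanilla convolution does.

```python
import torch
import torch.nn as nn

class GatedConv2d(nn.Module):
    """Gated convolution: feature branch modulated by a learned soft gate.

    The gate produces per-pixel, per-channel weights in [0, 1], letting the
    layer dynamically select which features to pass on.
    """
    def __init__(self, in_channels, out_channels,
                 kernel_size=3, stride=1, padding=1):
        super().__init__()
        # Two parallel convolutions over the same input.
        self.feature = nn.Conv2d(in_channels, out_channels,
                                 kernel_size, stride, padding)
        self.gate = nn.Conv2d(in_channels, out_channels,
                              kernel_size, stride, padding)
        self.activation = nn.LeakyReLU(0.2)

    def forward(self, x):
        # Sigmoid gating softly masks the activated features.
        return self.activation(self.feature(x)) * torch.sigmoid(self.gate(x))


# Usage: apply one gated layer to a dummy RGB image tensor.
x = torch.randn(1, 3, 64, 64)
layer = GatedConv2d(3, 32)
print(layer(x).shape)  # torch.Size([1, 32, 64, 64])
```

Stacking such layers in both deformation levels is what allows the network to deform the image layer by layer while ignoring invalid regions.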
Bibliography: These authors contributed equally to this work.
DOI: 10.1111/cgf.14148