Pose‐to‐Motion: Cross‐Domain Motion Retargeting with Pose Prior
| Published in: | Computer Graphics Forum, Volume 43, Issue 8 |
|---|---|
| Main authors: | , , , , |
| Format: | Journal Article |
| Language: | English |
| Publication details: | Oxford: Blackwell Publishing Ltd, 01.12.2024 |
| Subject: | |
| ISSN: | 0167-7055, 1467-8659 |
| Summary: | Creating plausible motions for a diverse range of characters is a long‐standing goal in computer graphics. Current learning‐based motion synthesis methods rely on large‐scale motion datasets, which are often difficult if not impossible to acquire. On the other hand, pose data is more accessible, since static posed characters are easier to create and can even be extracted from images using recent advancements in computer vision. In this paper, we tap into this alternative data source and introduce a neural motion synthesis approach through retargeting, which generates plausible motions for various characters that have only pose data by transferring motion from a single existing motion capture dataset of another, drastically different character. Our experiments show that our method effectively combines the motion features of the source character with the pose features of the target character, and performs robustly with small or noisy pose datasets, ranging from a few artist‐created poses to noisy poses estimated directly from images. Additionally, a user study indicated that a majority of participants found our retargeted motion more enjoyable to watch, more lifelike in appearance, and less prone to artifacts. Our code and dataset can be accessed here. |
|---|---|
| DOI: | 10.1111/cgf.15170 |
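
The summary above describes the idea only at a high level: a pose prior learned from the target character's static poses constrains motion transferred from a source character's motion capture data. The sketch below illustrates that general idea and is not the paper's actual architecture; the `PosePriorVAE` design, the dimensions, and the frame-wise projection in `retarget` are illustrative assumptions.

```python
# A minimal, self-contained sketch (PyTorch). All names, dimensions, and the
# architecture are illustrative assumptions, not the paper's method: a small
# VAE trained on the target character's static poses acts as a pose prior,
# and source motion is retargeted by projecting each frame through it.
import torch
import torch.nn as nn

POSE_DIM = 72  # assumed: 24 joints x 3 rotation parameters


class PosePriorVAE(nn.Module):
    """Pose prior: a VAE trained only on static target-character poses."""

    def __init__(self, pose_dim: int = POSE_DIM, latent_dim: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(pose_dim, 256), nn.ReLU(),
            nn.Linear(256, 2 * latent_dim),  # outputs mean and log-variance
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, pose_dim),
        )

    def encode(self, pose: torch.Tensor):
        mu, logvar = self.encoder(pose).chunk(2, dim=-1)
        return mu, logvar

    def forward(self, pose: torch.Tensor):
        mu, logvar = self.encode(pose)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.decoder(z), mu, logvar


def retarget(source_motion: torch.Tensor, prior: PosePriorVAE) -> torch.Tensor:
    """Map a source clip (frames x POSE_DIM) frame-by-frame through the target
    pose prior, so each output frame lies near the target pose manifold while
    the temporal ordering (the 'motion') comes from the source."""
    mu, _ = prior.encode(source_motion)  # deterministic projection: use mean
    return prior.decoder(mu)


if __name__ == "__main__":
    prior = PosePriorVAE()
    clip = torch.randn(120, POSE_DIM)    # 120 frames of dummy source motion
    out = retarget(clip, prior)
    print(out.shape)                     # torch.Size([120, 72])
```

In practice a system like the one summarized above would also need skeleton correspondence between the two characters and temporal smoothness terms; the sketch omits both to show only the pose-prior projection step.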