Generating Upper‐Body Motion for Real‐Time Characters Making their Way through Dynamic Environments


Detailed Bibliography
Published in: Computer Graphics Forum, Volume 41, Issue 8, pp. 169-181
Main Authors: Alvarado, Eduardo; Rohmer, Damien; Cani, Marie‐Paule
Format: Journal Article
Language: English
Published: Oxford: Blackwell Publishing Ltd (Wiley), 01.12.2022
ISSN: 0167-7055, 1467-8659
Description
Summary: Real‐time character animation in dynamic environments requires the generation of plausible upper‐body movements regardless of the nature of the environment, including non‐rigid obstacles such as vegetation. We propose a flexible model for upper‐body interactions, based on the anticipation of the character's surroundings, and on antagonistic controllers to adapt the amount of muscular stiffness and response time to better deal with obstacles. Our solution relies on a hybrid method for character animation that couples a keyframe sequence with kinematic constraints and lightweight physics. The dynamic response of the character's upper‐limbs leverages antagonistic controllers, allowing us to tune tension/relaxation in the upper‐body without diverging from the reference keyframe motion. A new sight model, controlled by procedural rules, enables high‐level authoring of the way the character generates interactions by adapting its stiffness and reaction time. As results show, our real‐time method offers precise and explicit control over the character's behavior and style, while seamlessly adapting to new situations. Our model is therefore well suited for gaming applications.
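To illustrate the antagonistic-controller idea the abstract refers to, here is a minimal, hypothetical sketch (not the authors' code): a joint is driven by two opposing proportional terms pulling toward a low and a high limit angle. Scaling both gains together raises the joint's effective stiffness without shifting its equilibrium pose, which is what allows tension/relaxation to be tuned independently of the reference motion. All names and parameters below are illustrative assumptions.

```python
def antagonistic_torque(theta, theta_dot, theta_low, theta_high,
                        k_low, k_high, damping):
    """Net joint torque from two antagonistic springs plus damping.

    k_low pulls the joint toward theta_low, k_high toward theta_high;
    the damping term resists joint velocity (illustrative formulation).
    """
    return (k_low * (theta_low - theta)
            + k_high * (theta_high - theta)
            - damping * theta_dot)


def equilibrium_angle(theta_low, theta_high, k_low, k_high):
    """Angle at which the two opposing spring torques cancel."""
    return (k_low * theta_low + k_high * theta_high) / (k_low + k_high)
```

With equal gains the joint rests midway between the limits; biasing one gain moves the equilibrium toward that limit, while multiplying both gains by the same factor stiffens the joint yet leaves the rest pose unchanged.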
DOI: 10.1111/cgf.14633