Generating Realistic Facial Expressions with Wrinkles for Model-Based Coding

Published in: Computer Vision and Image Understanding, Volume 84, Issue 2, pp. 201-240
Main authors: Yin, Lijun; Basu, Anup
Format: Journal Article
Language: English
Published: Elsevier Inc, 01.11.2001
ISSN: 1077-3142, 1090-235X
Description
Summary: Due to the limitations of current computer graphics technology, mimicking realistic facial textures, such as wrinkles, is very difficult. Facial texture updating and compression are crucial to achieving realistic facial animation for low bit rate model-based coding. In this paper, we present a partial texture updating method for realistic facial expression synthesis with facial wrinkles. First, fiducial points on a face are estimated using a color-based deformable template matching method. Second, an extended dynamic mesh matching algorithm is developed for face tracking. Next, textures of interest (TOI) in the potential expressive wrinkles and mouth–eye texture areas are captured by the detected fiducial points. Among the TOI, the so-called active textures or expressive textures are extracted by exploring temporal correlation information. Finally, the entire facial texture is synthesized using the active textures. Compared to the entire texture updating scheme, partially updating and compressing facial textures significantly reduce the computational complexity and bit rates while still producing acceptable visual quality. Experiments on video sequences demonstrate the advantage of the proposed algorithm.
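The core partial-updating idea in the summary, re-transmitting only the "active" textures whose content actually changes between frames, can be sketched roughly as follows. The function name, the bounding-box region format, and the mean-absolute-difference criterion are illustrative assumptions for this sketch, not the paper's actual formulation.

```python
import numpy as np

def select_active_textures(prev_frame, curr_frame, regions, threshold=10.0):
    """Pick the 'active' texture regions that changed between two frames.

    Illustrative sketch of partial texture updating: given candidate
    textures of interest as (y0, y1, x0, x1) bounding boxes, only the
    regions whose temporal change exceeds a threshold would be updated
    and re-encoded; the rest can reuse previously transmitted texture.
    The mean-absolute-difference measure here is an assumed stand-in
    for the paper's temporal correlation analysis.
    """
    active = []
    for i, (y0, y1, x0, x1) in enumerate(regions):
        prev_patch = prev_frame[y0:y1, x0:x1].astype(np.float64)
        curr_patch = curr_frame[y0:y1, x0:x1].astype(np.float64)
        # Mean absolute difference of the patch as a change score
        mad = np.abs(curr_patch - prev_patch).mean()
        if mad > threshold:
            active.append(i)
    return active
```

On a synthetic pair of frames where only one candidate region changes (say, a brightened patch standing in for an expressive wrinkle area), only that region's index is returned, so only that texture would need to be compressed and sent.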
DOI:10.1006/cviu.2001.0949