Statistical Modeling of Craniofacial Shape and Texture

Detailed Description

Bibliographic Details
Published in: International Journal of Computer Vision, Vol. 128, Issue 2, pp. 547–571
Main Authors: Dai, Hang; Pears, Nick; Smith, William; Duncan, Christian
Format: Journal Article
Language: English
Published: New York: Springer US, 01.02.2020
ISSN: 0920-5691, 1573-1405
Online Access: Full text
Description
Summary: We present a fully-automatic statistical 3D shape modeling approach and apply it to a large dataset of 3D images, the Headspace dataset, thus generating the first public shape-and-texture 3D morphable model (3DMM) of the full human head. Our approach is the first to employ a template that adapts to the dataset subject before dense morphing. This is fully automatic and achieved using 2D facial landmarking, projection to 3D shape, and mesh editing. In dense template morphing, we improve on the well-known Coherent Point Drift algorithm by incorporating iterative data-sampling and alignment. Our evaluations demonstrate that our method outperforms competing algorithms in correspondence accuracy and modeling ability. We propose a texture map refinement scheme to build high-quality texture maps and a texture model. We present several applications, including the first clinical use of craniofacial 3DMMs in the assessment of different types of surgical intervention applied to a craniosynostosis patient group.
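The "iterative data-sampling and alignment" idea mentioned in the abstract can be loosely illustrated with a toy NumPy sketch. This is not the authors' Coherent Point Drift variant: it assumes known point correspondences and uses a standard Kabsch rigid-alignment step, re-estimated each iteration from a random subsample of the points. All names and parameters here are illustrative.

```python
import numpy as np

def kabsch(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst
    (standard Kabsch/Procrustes solution via SVD)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1 so R is a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_d - R @ mu_s
    return R, t

rng = np.random.default_rng(0)
target = rng.normal(size=(500, 3))          # stand-in "template" point cloud
angle = np.pi / 8
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
source = target @ R_true.T + np.array([0.5, -0.2, 0.1])  # misaligned copy

est = source.copy()
for _ in range(10):
    # Data-sampling step: re-estimate the transform from a random subset.
    idx = rng.choice(len(est), size=200, replace=False)
    R, t = kabsch(est[idx], target[idx])
    est = est @ R.T + t                     # alignment step

err = np.linalg.norm(est - target, axis=1).mean()
```

With exact correspondences and no noise, the mean residual `err` drops to numerical zero; the paper's actual method instead embeds such sampling/alignment cycles inside a non-rigid, correspondence-free CPD framework.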
DOI: 10.1007/s11263-019-01260-7