Creating and Animating Subject-Specific Anatomical Models



Detailed Bibliography
Published in: Computer Graphics Forum, Vol. 29, No. 8, pp. 2340-2351
Main Authors: Gilles, B., Revéret, L., Pai, D. K.
Format: Journal Article
Language: English
Published: Oxford, UK: Blackwell Publishing Ltd, 1 December 2010 (Wiley)
ISSN: 0167-7055, 1467-8659
Description
Summary: Creating and animating subject‐specific anatomical models is traditionally a difficult process involving medical image segmentation, geometric corrections and the manual definition of kinematic parameters. In this paper, we introduce a novel template morphing algorithm that facilitates three‐dimensional modelling and parameterization of skeletons. Target data can be either medical images or surfaces of the whole skeleton. We incorporate prior knowledge about bone shape, the feasible skeleton pose and the morphological variability in the population. This allows for noise reduction, bone separation and the transfer, from the template, of anatomical and kinematical information not present in the input data. Our approach treats both local and global deformations in successive regularization steps: smooth elastic deformations are represented by an as‐rigid‐as‐possible displacement field between the reference and current configuration of the template, whereas global and discontinuous displacements are estimated through a projection onto a statistical shape model and a new joint pose optimization scheme with joint limits.
DOI: 10.1111/j.1467-8659.2010.01718.x
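
As a rough illustration of two of the regularization ingredients named in the summary, the sketch below shows (a) a Kabsch-style best-fit rotation of the kind used inside as-rigid-as-possible deformation energies, and (b) a projection of a flattened shape vector onto a PCA statistical shape model. This is a minimal sketch, not the paper's algorithm: the function names, array layouts and the ±3σ coefficient clamp are illustrative assumptions, and the actual method (a full ARAP displacement field plus a joint pose optimization with joint limits) is considerably more involved.

```python
import numpy as np

def best_fit_rotation(P_ref, P_cur):
    """Kabsch algorithm: rotation R minimizing sum ||R @ p_i - q_i||^2.

    P_ref, P_cur : (N, 3) matched points in the reference and current
                   configurations (e.g. the vertices of one bone cluster).
    """
    Pc = P_ref - P_ref.mean(axis=0)       # centre both point sets
    Qc = P_cur - P_cur.mean(axis=0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)   # SVD of the 3x3 covariance
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R

def project_onto_ssm(x, mean_shape, modes, stdevs, clip=3.0):
    """Project a morphed template onto a PCA statistical shape model.

    x          : (3N,) flattened vertex positions of the current template
    mean_shape : (3N,) model mean
    modes      : (3N, K) orthonormal principal modes of variation
    stdevs     : (K,) per-mode standard deviations from training data
    clip       : hypothetical +/- clip*sigma bound keeping the result
                 within the population's plausible shape variability
    """
    b = modes.T @ (x - mean_shape)                 # mode coefficients
    b = np.clip(b, -clip * stdevs, clip * stdevs)  # stay in feasible space
    return mean_shape + modes @ b
```

In a pipeline of this general shape, each regularization pass would alternate such steps: fit near-rigid rotations locally to keep the elastic deformation smooth, then project the whole skeleton onto the shape model to correct global, discontinuous displacements; a pose step would additionally clamp each joint coordinate to its feasible interval, which is what the joint limits in the summary refer to.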