Creating and Animating Subject-Specific Anatomical Models

Bibliographic Details
Published in: Computer Graphics Forum, Vol. 29, No. 8, pp. 2340–2351
Main Authors: Gilles, B., Revéret, L., Pai, D. K.
Format: Journal Article
Language:English
Published: Oxford, UK: Blackwell Publishing Ltd (Wiley), 01.12.2010
ISSN:0167-7055, 1467-8659
Description
Summary:Creating and animating subject‐specific anatomical models is traditionally a difficult process involving medical image segmentation, geometric corrections and the manual definition of kinematic parameters. In this paper, we introduce a novel template morphing algorithm that facilitates three‐dimensional modelling and parameterization of skeletons. Target data can be either medical images or surfaces of the whole skeleton. We incorporate prior knowledge about bone shape, the feasible skeleton pose and the morphological variability in the population. This allows for noise reduction, bone separation and the transfer, from the template, of anatomical and kinematical information not present in the input data. Our approach treats both local and global deformations in successive regularization steps: smooth elastic deformations are represented by an as‐rigid‐as‐possible displacement field between the reference and current configuration of the template, whereas global and discontinuous displacements are estimated through a projection onto a statistical shape model and a new joint pose optimization scheme with joint limits.
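The global regularization step described in the summary — projecting the deformed template onto a statistical shape model so the result stays within the morphological variability of the population — can be illustrated with a minimal PCA-based sketch. This is an assumption-laden illustration, not the authors' implementation: it assumes a linear model with an orthonormal mode matrix and clamps each shape coefficient to ±3 standard deviations, a common way to enforce a feasible shape range.

```python
import numpy as np

def project_onto_ssm(x, mean, modes, variances, n_sigma=3.0):
    """Project a flattened shape vector x onto a PCA statistical shape
    model (mean shape + orthonormal mode matrix), clamping each
    coefficient to +/- n_sigma standard deviations so the regularized
    shape stays within the modelled population variability."""
    b = modes.T @ (x - mean)              # shape coefficients in mode space
    limit = n_sigma * np.sqrt(variances)  # per-mode clamp from eigenvalues
    b = np.clip(b, -limit, limit)
    return mean + modes @ b               # regularized shape

# Toy model: two modes over a 6-DOF shape vector (illustrative only).
rng = np.random.default_rng(0)
modes, _ = np.linalg.qr(rng.normal(size=(6, 2)))  # orthonormal columns
mean = np.zeros(6)
variances = np.array([4.0, 1.0])                  # per-mode eigenvalues

x = rng.normal(size=6)                            # noisy input shape
x_reg = project_onto_ssm(x, mean, modes, variances)
```

In the paper's pipeline a projection of this kind handles global, discontinuous displacement, complementing the local as-rigid-as-possible elastic field; the joint pose optimization with joint limits plays an analogous clamping role in pose space.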
Bibliography:istex:A62CFD59A3A6308EE04FB53BF3285EA868BD78CF
ArticleID:CGF1718
ark:/67375/WNG-03V9ZKK2-Q
DOI:10.1111/j.1467-8659.2010.01718.x