Inverse Compositional Estimation of 3D Pose And Lighting in Dynamic Scenes

Bibliographic Details
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 30, No. 7, pp. 1300-1307
Main Authors: Xu, Yilei; Roy-Chowdhury, Amit
Format: Journal Article
Language: English
Published: Los Alamitos, CA: IEEE Computer Society, 01.07.2008
ISSN: 0162-8828, 1939-3539
Description
Summary: In this paper, we show how we can estimate, accurately and efficiently, the 3D motion of a rigid object and time-varying lighting in a dynamic scene. This is achieved in an inverse compositional tracking framework with a novel warping function that involves a 2D → 3D → 2D transformation. This also allows us to extend traditional two-frame inverse compositional tracking to a sequence of frames, leading to even higher computational savings. We prove the theoretical convergence of this method and show that it leads to significant reduction in computational burden. Experimental analysis on multiple video sequences shows impressive speedup over existing methods while retaining a high level of accuracy.
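
For readers who want a concrete sense of the inverse compositional idea the paper builds on, the sketch below implements generic two-frame inverse compositional alignment for a pure-translation warp, in the spirit of Baker and Matthews. It is not the paper's 2D → 3D → 2D warping function or its lighting model; it only illustrates why the inverse compositional form is efficient: the template gradient, Jacobian, and Gauss-Newton Hessian are computed once and reused at every iteration. All function and variable names are illustrative.

    # Minimal sketch: inverse compositional alignment for a pure-translation warp.
    # Assumes numpy and scipy are available; not the method of the paper itself.
    import numpy as np
    from scipy.ndimage import shift as warp_shift

    def inverse_compositional_translation(template, image, iters=50, tol=1e-6):
        """Estimate the translation p such that image(x + p) ~= template(x)."""
        # --- Precomputation on the template (done once; source of the speedup) ---
        gy, gx = np.gradient(template)               # template gradient
        sd = np.stack([gx.ravel(), gy.ravel()], 1)   # steepest-descent images (Jacobian = I for translation)
        H = sd.T @ sd                                # Gauss-Newton Hessian
        H_inv = np.linalg.inv(H)

        p = np.zeros(2)                              # current translation estimate (dx, dy)
        for _ in range(iters):
            # Warp the input image with the current estimate: I(W(x; p)) = I(x + p)
            warped = warp_shift(image, shift=(-p[1], -p[0]), order=1, mode="nearest")
            error = (warped - template).ravel()
            # Gauss-Newton step computed entirely from precomputed template quantities
            dp = H_inv @ (sd.T @ error)
            # Inverse compositional update: W(x; p) <- W(x; p) o W(x; dp)^(-1)
            p -= dp
            if np.linalg.norm(dp) < tol:
                break
        return p

    if __name__ == "__main__":
        # Synthetic usage example: a smooth bump translated by a known amount.
        y, x = np.mgrid[0:64, 0:64]
        template = np.exp(-((x - 32.0) ** 2 + (y - 32.0) ** 2) / 50.0)
        true_p = np.array([2.3, -1.7])               # (dx, dy)
        image = warp_shift(template, shift=(true_p[1], true_p[0]), order=1, mode="nearest")
        print("estimated:", inverse_compositional_translation(template, image))
        print("true:     ", true_p)

In the standard forward-additive formulation the Hessian would have to be recomputed at every iteration because it depends on the warped input image; moving the linearization to the template side is what makes the per-frame cost so low, and the paper extends this saving further across a sequence of frames.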
DOI: 10.1109/TPAMI.2008.81