Long-Range Motion Trajectories Extraction of Articulated Human Using Mesh Evolution

Bibliographic Details
Published in: IEEE Signal Processing Letters, Vol. 23, No. 4, pp. 507-511
Main Authors: Yuanyuan Wu, Xiaohai He, Byeongkeun Kang, Haiying Song, Truong Q. Nguyen
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.04.2016
ISSN: 1070-9908, 1558-2361
Description
Summary: This letter presents a novel approach to extracting reliable, dense, and long-range motion trajectories of an articulated human in a video sequence. In contrast to existing approaches, which emphasize the temporal consistency of each tracked point, we also consider the spatial structure of the tracked points on the articulated human. We treat the points as a set of vertices and build a triangle mesh that joins them in image space, so the problem of extracting long-range motion trajectories becomes one of keeping the mesh evolution consistent over time. First, self-occlusion is detected by a novel mesh-based method, and an adaptive motion estimation method is proposed to initialize the mesh between successive frames. We then propose an iterative algorithm that efficiently adjusts the mesh vertices toward a physically plausible deformation satisfying both local-rigidity and silhouette constraints. Finally, we compare the proposed method with state-of-the-art methods on a set of challenging sequences. Evaluations demonstrate that our method achieves favorable performance in terms of both the accuracy and the integrity of the extracted trajectories.
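
To make the pipeline in the summary concrete, below is a minimal Python sketch of the mesh-evolution idea. It is not the authors' algorithm: the Delaunay triangulation, the optical-flow-based initialization, and the Laplacian-style local-rigidity update are illustrative assumptions, all names are hypothetical, and the silhouette constraint and the mesh-based occlusion test are omitted for brevity.

# Illustrative sketch only; assumes numpy and scipy are available.
import numpy as np
from scipy.spatial import Delaunay


def build_mesh(points):
    """Join tracked points (an (N, 2) array) into a triangle mesh in image space."""
    return Delaunay(points)  # .simplices holds the triangle vertex indices


def vertex_neighbors(tri):
    """Map each vertex index to the set of vertices it shares a mesh edge with."""
    nbrs = {i: set() for i in range(tri.points.shape[0])}
    for a, b, c in tri.simplices:
        nbrs[a].update((b, c))
        nbrs[b].update((a, c))
        nbrs[c].update((a, b))
    return nbrs


def evolve_mesh(points, flow, nbrs, n_iters=10, rigidity=0.5):
    """Initialize next-frame vertex positions from per-point motion estimates,
    then iteratively pull each vertex toward a locally rigid configuration
    (a crude stand-in for the paper's local-rigidity constraint).

    points : (N, 2) vertex positions in the current frame
    flow   : (N, 2) motion estimates (e.g. sampled from dense optical flow)
    """
    rest = points.copy()       # rest shape defining the preserved edge vectors
    pos = points + flow        # motion-based initialization of the next frame
    for _ in range(n_iters):
        new_pos = pos.copy()
        for i, js in nbrs.items():
            js = list(js)
            # Each neighbor j predicts vertex i so the edge j->i keeps its
            # rest-shape vector; average the predictions and blend them in.
            pred = np.mean(pos[js] + (rest[i] - rest[js]), axis=0)
            new_pos[i] = (1 - rigidity) * pos[i] + rigidity * pred
        pos = new_pos
    return pos

Under these assumptions, trajectories are obtained by tracking the mesh frame to frame: build the mesh once on the first frame, then repeatedly call evolve_mesh with per-vertex flow samples, appending each frame's vertex positions to the trajectory of that vertex.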
DOI: 10.1109/LSP.2016.2536647