Progressive search space reduction for human pose estimation

Published in: 2008 IEEE Conference on Computer Vision and Pattern Recognition, pp. 1-8
Main Authors: Ferrari, V., Marin-Jimenez, M., Zisserman, A.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.06.2008
ISBN: 9781424422425, 1424422426
ISSN: 1063-6919
Description
Summary: The objective of this paper is to estimate 2D human pose as a spatial configuration of body parts in TV and movie video shots. Such video material is uncontrolled and extremely challenging. We propose an approach that progressively reduces the search space for body parts, to greatly improve the chances that pose estimation will succeed. This involves two contributions: (i) a generic detector using a weak model of pose to substantially reduce the full pose search space; and (ii) employing 'grabcut' initialized on detected regions proposed by the weak model, to further prune the search space. Moreover, we also propose (iii) an integrated spatio-temporal model covering multiple frames to refine pose estimates from individual frames, with inference using belief propagation. The method is fully automatic and self-initializing, and explains the spatio-temporal volume covered by a person moving in a shot, by soft-labeling every pixel as belonging to a particular body part or to the background. We demonstrate upper-body pose estimation by an extensive evaluation over 70,000 frames from four episodes of the TV series Buffy the Vampire Slayer, and present an application to full-body action recognition on the Weizmann dataset.
DOI:10.1109/CVPR.2008.4587468