Multi-task Correlation Particle Filter for Robust Object Tracking

Detailed Bibliography
Published in: 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 4819-4827
Main Authors: Tianzhu Zhang, Changsheng Xu, Ming-Hsuan Yang
Format: Conference paper
Language: English
Published: IEEE, 1 July 2017
ISSN: 1063-6919
Description
Summary: In this paper, we propose a multi-task correlation particle filter (MCPF) for robust visual tracking. We first present the multi-task correlation filter (MCF) that takes the interdependencies among different features into account to learn correlation filters jointly. The proposed MCPF is designed to exploit and complement the strengths of the MCF and the particle filter. Compared with existing tracking methods based on correlation filters and particle filters, the proposed tracker has several advantages. First, it can shepherd the sampled particles toward the modes of the target state distribution via the MCF, thereby resulting in robust tracking performance. Second, it can effectively handle large scale variation via a particle sampling strategy. Third, it can effectively maintain multiple modes in the posterior density using fewer particles than conventional particle filters, thereby lowering the computational cost. Extensive experimental results on three benchmark datasets demonstrate that the proposed MCPF performs favorably against the state-of-the-art methods.
DOI: 10.1109/CVPR.2017.512
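
Illustration: as a rough sketch of how a correlation filter can guide a particle filter in the way the summary describes, the Python code below propagates particles, shifts ("shepherds") each one toward the peak of a correlation response, weights it by the peak value, and resamples. This is a minimal, single-channel MOSSE-style stand-in, not the paper's multi-task formulation: the names (CorrelationFilter, extract_patch, track_step), the motion model, and all parameter values are illustrative assumptions, and the joint multi-feature learning, scale sampling, and model updates of the actual MCPF are omitted.

import numpy as np


def gaussian_label(size, sigma=2.0):
    # Desired correlation output: a Gaussian peak at the patch centre.
    ys, xs = np.mgrid[0:size, 0:size]
    d2 = (ys - size // 2) ** 2 + (xs - size // 2) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))


class CorrelationFilter:
    # Single-channel MOSSE-style filter learned in the Fourier domain
    # (a simplified stand-in for the paper's jointly learned filters).
    def __init__(self, size, lam=1e-2):
        self.Y = np.fft.fft2(gaussian_label(size))
        self.lam = lam
        self.H = None

    def train(self, patch):
        X = np.fft.fft2(patch)
        self.H = (self.Y * np.conj(X)) / (X * np.conj(X) + self.lam)

    def respond(self, patch):
        return np.real(np.fft.ifft2(self.H * np.fft.fft2(patch)))


def extract_patch(frame, cx, cy, size):
    # Crop a size x size grayscale patch centred at (cx, cy),
    # zero-padded at image borders.
    half = size // 2
    patch = np.zeros((size, size), dtype=np.float64)
    for dy in range(size):
        for dx in range(size):
            y, x = int(cy) - half + dy, int(cx) - half + dx
            if 0 <= y < frame.shape[0] and 0 <= x < frame.shape[1]:
                patch[dy, dx] = frame[y, x]
    return patch


def track_step(frame, particles, cf, size, motion_std=4.0):
    # One tracking step: propagate, shepherd, weight, estimate, resample.
    particles = particles + np.random.normal(0.0, motion_std, particles.shape)
    weights = np.zeros(len(particles))
    for i, (cx, cy) in enumerate(particles):
        resp = cf.respond(extract_patch(frame, cx, cy, size))
        py, px = np.unravel_index(np.argmax(resp), resp.shape)
        # Shepherd the particle toward the response peak ...
        particles[i] = [cx + px - size // 2, cy + py - size // 2]
        # ... and weight it by the peak response value.
        weights[i] = max(resp.max(), 1e-8)
    weights /= weights.sum()
    state = weights @ particles                      # weighted mean position
    idx = np.random.choice(len(particles), len(particles), p=weights)
    return state, particles[idx]                     # estimate, resampled set


# Usage (with placeholder random frames): train the filter on the first
# frame, then call track_step once per subsequent grayscale frame.
size = 64
frame0 = np.random.rand(240, 320)
cx0, cy0 = 160.0, 120.0                              # initial target centre
cf = CorrelationFilter(size)
cf.train(extract_patch(frame0, cx0, cy0, size))
particles = np.tile([cx0, cy0], (100, 1)).astype(float)
state, particles = track_step(np.random.rand(240, 320), particles, cf, size)

Because each particle is pulled to a local response peak before weighting, far fewer particles are needed to cover the modes of the posterior than in a plain bootstrap particle filter, which is the computational advantage the summary claims.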