Multi-task Correlation Particle Filter for Robust Object Tracking

Detailed bibliography
Published in: 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 4819-4827
Main authors: Tianzhu Zhang, Changsheng Xu, Ming-Hsuan Yang
Format: Conference paper
Language: English
Publication details: IEEE, 01.07.2017
ISSN: 1063-6919
Online access: Get full text
Description
Summary: In this paper, we propose a multi-task correlation particle filter (MCPF) for robust visual tracking. We first present the multi-task correlation filter (MCF) that takes the interdependencies among different features into account to learn correlation filters jointly. The proposed MCPF is designed to exploit and complement the strengths of the MCF and a particle filter. Compared with existing tracking methods based on correlation filters and particle filters, the proposed tracker has several advantages. First, it can shepherd the sampled particles toward the modes of the target state distribution via the MCF, thereby resulting in robust tracking performance. Second, it can effectively handle large scale variation via a particle sampling strategy. Third, it can effectively maintain multiple modes in the posterior density using fewer particles than conventional particle filters, thereby lowering the computational cost. Extensive experimental results on three benchmark datasets demonstrate that the proposed MCPF performs favorably against the state-of-the-art methods.
DOI: 10.1109/CVPR.2017.512
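
The summary above describes the approach only at a high level. As a rough, hypothetical illustration of the general idea of coupling a particle filter with a correlation filter, the Python/NumPy sketch below shifts each sampled particle to the peak of a correlation-filter response ("shepherding" it toward a mode of the target state distribution) and weights it by that peak. The names mcpf_step and correlation_response, the single-channel filter, the Gaussian particle spread, and the synthetic data are illustrative assumptions, not the authors' implementation or the multi-feature MCF described in the paper.

import numpy as np

def correlation_response(patch, filt):
    # Circular correlation via FFT; a single-channel stand-in for the
    # jointly learned multi-feature correlation filter in the abstract.
    return np.real(np.fft.ifft2(np.fft.fft2(patch) * np.conj(np.fft.fft2(filt))))

def mcpf_step(frame, particles, filt, patch_size=32):
    # One conceptual iteration: crop a patch around each sampled particle,
    # move the particle to the peak of the correlation response, and
    # weight it by the peak value.
    h, w = frame.shape
    half = patch_size // 2
    refined, scores = [], []
    for x, y in particles:
        x = int(np.clip(x, half, w - half))
        y = int(np.clip(y, half, h - half))
        patch = frame[y - half:y + half, x - half:x + half]
        resp = correlation_response(patch, filt)
        dy, dx = np.unravel_index(np.argmax(resp), resp.shape)
        # Circular correlation wraps around; re-center large shifts.
        dx = dx - patch_size if dx > half else dx
        dy = dy - patch_size if dy > half else dy
        refined.append((x + dx, y + dy))
        scores.append(resp.max())
    scores = np.asarray(scores)
    weights = np.exp(scores - scores.max())
    return refined, weights / weights.sum()

# Toy usage with synthetic data (no real tracker or dataset involved).
rng = np.random.default_rng(0)
frame = rng.random((240, 320))
filt = rng.random((32, 32))
particles = [(160 + rng.normal(0, 8), 120 + rng.normal(0, 8)) for _ in range(50)]
particles, weights = mcpf_step(frame, particles, filt)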