AMtrack: Anti-occlusion multi-object tracking algorithm

Bibliographic Details
Published in: Signal, Image and Video Processing, Vol. 18, Issue 12, pp. 9305-9318
Main Authors: Liu, Zhigang; Huang, Xiaohang; Sun, Jianwei; Zhang, Xinchang
Format: Journal Article
Language: English
Published: London: Springer London, 01.12.2024 (Springer Nature B.V.)
ISSN: 1863-1703, 1863-1711
Description
Summary: To address the problem of occlusion in multi-object tracking (MOT) scenarios, an anti-occlusion multi-object tracking (AMTrack) algorithm is proposed, which is applicable not only to low-viewpoint but also to high-viewpoint MOT. Firstly, an occlusion index (ONI) calculation method is given to quantify object occlusion, helping to measure the extent of occlusion for each object. Secondly, an anti-occlusion multi-level association (AMLA) mechanism is designed to mitigate the impact of occlusion on data association by hierarchically associating objects containing different levels of noise, ensuring robust tracking in occluded environments. Additionally, an anti-noise trajectory feature update (ATFU) method is presented to reduce the noise from occlusion by selectively updating the trajectory feature. Finally, the experimental results on the MOT16, MOT17, MOT20 and DanceTrack test sets show that AMTrack exhibits state-of-the-art performance compared with classic methods such as OsaMOT, DeepSORT, RetinaMOT and TicrossNet, demonstrating its effectiveness and robustness in multi-object tracking scenarios with occlusion.
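The abstract does not give the paper's ONI formula, but the idea of quantifying per-object occlusion can be illustrated with a minimal sketch: for each axis-aligned bounding box, take the largest fraction of its area covered by any other box as a simple occlusion score. This is an illustrative assumption, not AMTrack's actual definition; the function name `occlusion_index` and the max-overlap choice are hypothetical.

```python
def occlusion_index(boxes):
    """For each box (x1, y1, x2, y2), return a score in [0, 1]:
    the largest fraction of that box's area overlapped by any other box.
    A plausible stand-in for an occlusion index; not the paper's formula."""
    scores = []
    for i, (ax1, ay1, ax2, ay2) in enumerate(boxes):
        area = max(0.0, ax2 - ax1) * max(0.0, ay2 - ay1)
        best = 0.0
        for j, (bx1, by1, bx2, by2) in enumerate(boxes):
            if j == i or area == 0.0:
                continue
            # Intersection of box i with box j (zero if they do not overlap).
            iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
            ih = max(0.0, min(ay2, by2) - max(ay1, by1))
            best = max(best, (iw * ih) / area)
        scores.append(best)
    return scores


# Two overlapping boxes and one isolated box:
print(occlusion_index([(0, 0, 10, 10), (5, 0, 15, 10), (100, 100, 110, 110)]))
# → [0.5, 0.5, 0.0]
```

A score like this could then drive the kind of hierarchical association the abstract describes, e.g. matching lightly occluded objects first and heavily occluded ones in later, more tolerant association stages.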
DOI: 10.1007/s11760-024-03547-w