Efficient real-time multi-object tracking algorithm for complex scenarios

Detailed bibliography
Published in: Signal, Image and Video Processing, Vol. 19, No. 13, p. 1148
Main authors: Li, Yufeng; An, Tianyang; Hu, Nairui; Wang, Haiyao
Format: Journal Article
Language: English
Published: London: Springer London (Springer Nature B.V.), 01.12.2025
ISSN:1863-1703, 1863-1711
Description
Summary: In this paper, we propose an enhanced Multi-Object Tracking (MOT) framework based on ByteTrack, achieving dual improvements in efficiency and performance while significantly enhancing the robustness and real-time performance of the algorithm in complex scenarios. During the initial matching stage, we integrate an appearance feature matching branch employing a VMamba backbone network to mitigate occlusion-induced detection failures caused by significant appearance variations. Simultaneously, we introduce a computationally optimized appearance feature extraction method to reduce redundant computational overhead and improve resource utilization. Comprehensive evaluations demonstrate the framework’s effectiveness in high-density scenarios, achieving state-of-the-art performance on the Multiple Object Tracking Challenge 2017 (MOT17) test set with 80.9 Multiple Object Tracking Accuracy (MOTA), 79.6 Identity F1 Score (IDF1), and 64.4 Higher Order Tracking Accuracy (HOTA), while maintaining real-time processing at 26.6 FPS. The proposed method also demonstrates superior performance on the MOT20 (Multiple Object Tracking Challenge 2020) benchmark, particularly in preserving identity consistency under severe occlusion.
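The two-stage association that ByteTrack is known for, and which the summary's "initial matching stage" builds on, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it uses plain IoU matching and a greedy assignment in place of the Hungarian algorithm, and the function names, thresholds, and box format (`[x1, y1, x2, y2]`) are assumptions for illustration. The key idea is that low-confidence detections, often occluded targets whose detector score has dropped, are not discarded but matched in a second pass.

```python
def iou(box_a, box_b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    x1 = max(box_a[0], box_b[0]); y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2]); y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def associate(tracks, detections, scores, high_thresh=0.6, iou_thresh=0.3):
    """Two-stage ByteTrack-style association (illustrative sketch).

    Stage 1 matches tracks to high-confidence detections by IoU;
    stage 2 retries still-unmatched tracks against low-confidence
    detections. Greedy best-IoU matching stands in for the Hungarian
    algorithm used in practice.
    """
    high = [i for i, s in enumerate(scores) if s >= high_thresh]
    low = [i for i, s in enumerate(scores) if s < high_thresh]
    matches, unmatched_tracks = [], list(range(len(tracks)))

    for det_pool in (high, low):            # stage 1, then stage 2
        remaining = []
        for t in unmatched_tracks:
            best, best_iou = None, iou_thresh
            for d in det_pool:
                if any(d == m[1] for m in matches):
                    continue                # detection already taken
                v = iou(tracks[t], detections[d])
                if v > best_iou:
                    best, best_iou = d, v
            if best is not None:
                matches.append((t, best))
            else:
                remaining.append(t)
        unmatched_tracks = remaining
    return matches, unmatched_tracks
```

The paper's contribution, per the summary, is to augment this motion/IoU matching in the first stage with a VMamba-based appearance feature branch, which helps when occlusion makes IoU alone unreliable.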
DOI:10.1007/s11760-025-04748-7