Efficient real-time multi-object tracking algorithm for complex scenarios



Bibliographic Details
Published in: Signal, Image and Video Processing, Vol. 19, No. 13, p. 1148
Main authors: Li, Yufeng; An, Tianyang; Hu, Nairui; Wang, Haiyao
Format: Journal Article
Language: English
Published: London: Springer London, 01.12.2025
Springer Nature B.V.
ISSN: 1863-1703, 1863-1711
Online access: Full text
Description
Abstract: In this paper, we propose an enhanced Multi-Object Tracking (MOT) framework based on ByteTrack, achieving dual improvements in efficiency and performance while significantly enhancing the robustness and real-time performance of the algorithm in complex scenarios. During the initial matching stage, we integrate an appearance feature matching branch employing a VMamba backbone network to mitigate occlusion-induced detection failures caused by significant appearance variations. Simultaneously, we introduce a computationally optimized appearance feature extraction method to reduce redundant computational overhead and improve resource utilization. Comprehensive evaluations demonstrate the framework's effectiveness in high-density scenarios, achieving state-of-the-art performance on the Multiple Object Tracking Challenge 2017 (MOT17) test set with 80.9 Multiple Object Tracking Accuracy (MOTA), 79.6 Identity F1 Score (IDF1), and 64.4 Higher Order Tracking Accuracy (HOTA), while maintaining real-time processing at 26.6 FPS. The proposed method also demonstrates superior performance on the MOT20 (Multiple Object Tracking Challenge 2020) benchmark, particularly in preserving identity consistency under severe occlusion scenarios.
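The abstract describes adding an appearance feature matching branch to ByteTrack's initial association stage. The sketch below illustrates the general idea of such a fused association step: an IoU-based motion cost is combined with a cosine-distance appearance cost before Hungarian matching. This is a minimal illustration, not the paper's actual method; the weighting scheme (`w_app`), the rejection threshold (`cost_thresh`), and all function names are illustrative assumptions, and the real framework extracts embeddings with a VMamba backbone rather than taking them as precomputed arrays.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def iou_matrix(tracks, dets):
    """Pairwise IoU between track and detection boxes (x1, y1, x2, y2)."""
    ious = np.zeros((len(tracks), len(dets)))
    for i, t in enumerate(tracks):
        for j, d in enumerate(dets):
            x1, y1 = max(t[0], d[0]), max(t[1], d[1])
            x2, y2 = min(t[2], d[2]), min(t[3], d[3])
            inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
            area_t = (t[2] - t[0]) * (t[3] - t[1])
            area_d = (d[2] - d[0]) * (d[3] - d[1])
            ious[i, j] = inter / (area_t + area_d - inter + 1e-9)
    return ious

def fused_association(tracks, dets, track_feats, det_feats,
                      w_app=0.5, cost_thresh=0.8):
    """Match tracks to detections on a weighted sum of a motion (1 - IoU)
    cost and an appearance (cosine distance) cost; pairs whose fused cost
    exceeds cost_thresh are left unmatched."""
    motion_cost = 1.0 - iou_matrix(tracks, dets)
    # Cosine distance between L2-normalised embeddings.
    tf = track_feats / np.linalg.norm(track_feats, axis=1, keepdims=True)
    df = det_feats / np.linalg.norm(det_feats, axis=1, keepdims=True)
    app_cost = 1.0 - tf @ df.T
    cost = (1.0 - w_app) * motion_cost + w_app * app_cost
    rows, cols = linear_sum_assignment(cost)
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] < cost_thresh]
```

In a ByteTrack-style pipeline this fused cost would only drive the first, high-confidence association round; low-confidence detections are typically matched in a second round on motion cost alone.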
DOI: 10.1007/s11760-025-04748-7