Efficient real-time multi-object tracking algorithm for complex scenarios

Bibliographic Details
Published in: Signal, Image and Video Processing, Vol. 19, No. 13, p. 1148
Main Authors: Li, Yufeng; An, Tianyang; Hu, Nairui; Wang, Haiyao
Format: Journal Article
Language: English
Published: London: Springer London, 01.12.2025 (Springer Nature B.V.)
ISSN: 1863-1703, 1863-1711
Description
Summary: In this paper, we propose an enhanced Multi-Object Tracking (MOT) framework based on ByteTrack that achieves dual improvements in efficiency and performance while significantly enhancing the algorithm's robustness and real-time performance in complex scenarios. During the initial matching stage, we integrate an appearance feature matching branch employing a VMamba backbone network to mitigate occlusion-induced detection failures caused by significant appearance variations. Simultaneously, we introduce a computationally optimized appearance feature extraction method to reduce redundant computational overhead and improve resource utilization. Comprehensive evaluations demonstrate the framework's effectiveness in high-density scenarios: it achieves state-of-the-art performance on the MOT17 (Multiple Object Tracking Challenge 2017) test set with 80.9 MOTA (Multiple Object Tracking Accuracy), 79.6 IDF1 (Identity F1 Score), and 64.4 HOTA (Higher Order Tracking Accuracy), while maintaining real-time processing at 26.6 FPS. The proposed method also demonstrates superior performance on the MOT20 (Multiple Object Tracking Challenge 2020) benchmark, particularly in preserving identity consistency under severe occlusion.
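
The abstract describes adding an appearance feature matching branch to ByteTrack's first association stage. The sketch below illustrates one common way such a fusion is wired up: blending a motion cost (1 − IoU) with a cosine appearance distance before Hungarian assignment. This is a minimal illustrative sketch, not the authors' implementation; the weight `alpha`, the threshold `max_cost`, and all function names are assumptions, and the paper's VMamba-based feature extractor is abstracted away as precomputed embeddings.

```python
# Hedged sketch of fusing appearance similarity into ByteTrack-style
# first-stage association. All names, weights, and thresholds are
# illustrative assumptions, not values from the paper.
import numpy as np
from scipy.optimize import linear_sum_assignment

def iou_matrix(tracks: np.ndarray, dets: np.ndarray) -> np.ndarray:
    """Pairwise IoU between track boxes and detection boxes (x1, y1, x2, y2)."""
    tl = np.maximum(tracks[:, None, :2], dets[None, :, :2])   # intersection top-left
    br = np.minimum(tracks[:, None, 2:], dets[None, :, 2:])   # intersection bottom-right
    inter = np.prod(np.clip(br - tl, 0, None), axis=2)
    area_t = np.prod(tracks[:, 2:] - tracks[:, :2], axis=1)
    area_d = np.prod(dets[:, 2:] - dets[:, :2], axis=1)
    return inter / (area_t[:, None] + area_d[None, :] - inter + 1e-9)

def fused_cost(tracks, dets, track_emb, det_emb, alpha=0.5):
    """Blend motion cost (1 - IoU) with appearance cost (cosine distance)."""
    motion_cost = 1.0 - iou_matrix(tracks, dets)
    t = track_emb / (np.linalg.norm(track_emb, axis=1, keepdims=True) + 1e-9)
    d = det_emb / (np.linalg.norm(det_emb, axis=1, keepdims=True) + 1e-9)
    appearance_cost = 1.0 - t @ d.T   # cosine distance between embeddings
    return alpha * motion_cost + (1.0 - alpha) * appearance_cost

def associate(tracks, dets, track_emb, det_emb, max_cost=0.8):
    """Hungarian assignment on the fused cost matrix; unmatched tracks and
    detections would fall through to ByteTrack's low-score second stage
    (not shown here)."""
    cost = fused_cost(tracks, dets, track_emb, det_emb)
    rows, cols = linear_sum_assignment(cost)
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= max_cost]
```

In this formulation, lowering `alpha` gives more weight to appearance, which is the lever that helps preserve identities through occlusions when IoU alone becomes unreliable.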
DOI: 10.1007/s11760-025-04748-7