Intelligent Multi-Sensor Data Fusion for Enhanced RADAR and Optical Imaging Applications Using Huffman Encoding

Bibliographic Details
Published in: 2025 IEEE Space, Aerospace and Defence Conference (SPACE), pp. 1-6
Main Authors: Devarajan, Anjali; Jain, Ashrith P; Goswami, Ashutosh; Vats, Ayush; V, Kiran
Format: Conference Paper
Language: English
Published: IEEE, 21.07.2025
Description
Summary: The growing demand for real-time object detection and tracking in autonomous systems, military surveillance, and wearable safety applications has highlighted significant challenges in sensor fusion, computational efficiency, and environmental adaptability. This paper presents a novel multi-sensor fusion framework that integrates Frequency Modulated Continuous Wave (FMCW) radar, a YOLO-powered camera module, and GPS to enhance detection accuracy and robustness under diverse conditions. A key challenge in real-time vision-based systems is the computational overhead of high-resolution image processing, which limits deployment in resource-constrained embedded platforms. To address this, Huffman encoding is applied to the camera feed, reducing memory consumption and processing latency while preserving critical object features. Experimental results demonstrate that the proposed system achieves a 28.9% reduction in inference time and a 26% reduction in model size with minimal accuracy loss (0.6% mAP drop). By optimizing data fusion and compression techniques, this work provides a scalable and energy-efficient solution for modern Advanced Driver Assistance Systems (ADAS), battlefield situational awareness, and intelligent security monitoring, addressing key limitations in existing autonomous perception technologies.
DOI: 10.1109/SPACE65882.2025.11170618
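
The summary above describes applying Huffman encoding to the camera feed to cut memory use and latency before detection. The paper's actual pipeline is not reproduced here; the following is only a minimal, self-contained sketch of byte-level Huffman coding over a grayscale frame buffer. The function names (build_huffman_codes, huffman_encode), the toy frame contents, and the byte-per-pixel assumption are illustrative choices, not the authors' implementation.

```python
import heapq
from collections import Counter

def build_huffman_codes(data: bytes) -> dict:
    """Build a Huffman code table (symbol -> bitstring) for the bytes in `data`."""
    freq = Counter(data)
    # Heap entries: (cumulative frequency, tie-breaker, {symbol: code-so-far}).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                      # degenerate frame with a single intensity
        return {sym: "0" for sym in heap[0][2]}
    tie = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)   # two least-frequent subtrees
        f2, _, right = heapq.heappop(heap)
        # Prepend "0" to codes in the left subtree and "1" to those in the right.
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

def huffman_encode(data: bytes):
    """Return the encoded bitstring plus the code table needed for decoding."""
    codes = build_huffman_codes(data)
    return "".join(codes[b] for b in data), codes

if __name__ == "__main__":
    # Toy 8x8 grayscale "frame" with a skewed intensity histogram,
    # the case where Huffman coding pays off.
    frame = bytes([0] * 40 + [128] * 16 + [255] * 8)
    bits, codes = huffman_encode(frame)
    print(f"raw: {len(frame) * 8} bits, encoded: {len(bits)} bits")  # 512 -> 88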