Enhancing Indoor Mobile Robot Localization through the Integration of Multi-Sensor Fusion Algorithms

Bibliographic Details
Published in: 2024 1st International Conference on Robotics, Engineering, Science, and Technology (RESTCON), pp. 101-105
Authors: Yatigul, Rut; Pudchuen, Noppadol; Blattler, Aran; Jitviriya, Wisanu
Format: Conference paper
Language: English
Published: IEEE, 16 February 2024
Online access: Full text
Description
Abstract: This paper presents an innovative approach that combines visual odometry, an Inertial Measurement Unit (IMU), and wheel odometry, using the Extended Kalman Filter and the Unscented Kalman Filter to enhance mobile robot localization and mapping. The primary goal is to improve localization accuracy and robustness while providing a cost-effective alternative to traditional SLAM methods that rely on expensive LIDAR and RGB-D camera systems. By integrating visual odometry, IMU data, and wheel odometry, our method not only enhances the precision and robustness of the mobile robot localization system but also reduces the financial burden associated with high-end sensor hardware. This fusion creates a reliable solution particularly suited to resource-constrained environments. Our research contributes to the democratization of SLAM technologies, making them accessible to a wider range of applications. The results presented in this paper showcase the potential of our approach and highlight the pivotal role of the multi-sensor fusion framework, which significantly enhances localization accuracy and robustness.
DOI: 10.1109/RESTCON60981.2024.10463559
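
The abstract does not spell out the filter design, but the sensor roles it describes map naturally onto a standard EKF loop: wheel odometry drives the prediction step, while visual odometry (and, analogously, IMU orientation) supplies measurement updates. The Python sketch below illustrates that pattern for a planar robot; the state vector [x, y, theta], the unicycle motion model, the noise covariances Q and R, and the PoseEKF/wrap_angle names are all illustrative assumptions, not details taken from the paper.

    import numpy as np

    def wrap_angle(a):
        # Keep angles in (-pi, pi] so heading innovations stay small.
        return (a + np.pi) % (2.0 * np.pi) - np.pi

    class PoseEKF:
        """Minimal planar EKF sketch: wheel odometry drives prediction,
        visual odometry supplies absolute-pose corrections (assumed setup)."""

        def __init__(self):
            self.x = np.zeros(3)                  # state: [x, y, theta]
            self.P = np.eye(3) * 0.1              # state covariance
            self.Q = np.diag([0.02, 0.02, 0.01])  # process noise (assumed values)
            self.R = np.diag([0.05, 0.05, 0.03])  # VO measurement noise (assumed values)

        def predict(self, v, omega, dt):
            # Unicycle motion model driven by wheel-odometry velocities.
            th = self.x[2]
            self.x = self.x + np.array([v * np.cos(th) * dt,
                                        v * np.sin(th) * dt,
                                        omega * dt])
            self.x[2] = wrap_angle(self.x[2])
            # Jacobian of the motion model with respect to the state.
            F = np.array([[1.0, 0.0, -v * np.sin(th) * dt],
                          [0.0, 1.0,  v * np.cos(th) * dt],
                          [0.0, 0.0,  1.0]])
            self.P = F @ self.P @ F.T + self.Q

        def update(self, z):
            # Direct pose measurement from visual odometry: H is the identity.
            H = np.eye(3)
            y = z - self.x
            y[2] = wrap_angle(y[2])               # wrap the heading innovation
            S = H @ self.P @ H.T + self.R
            K = self.P @ H.T @ np.linalg.inv(S)   # Kalman gain
            self.x = self.x + K @ y
            self.x[2] = wrap_angle(self.x[2])
            self.P = (np.eye(3) - K @ H) @ self.P

    # Example usage with hypothetical rates and readings:
    ekf = PoseEKF()
    ekf.predict(v=0.5, omega=0.1, dt=0.02)       # wheel odometry at 50 Hz
    ekf.update(np.array([0.011, 0.0, 0.002]))    # VO pose when a frame arrives

An IMU yaw or yaw-rate reading could be fused the same way through an additional update with a 1x3 measurement matrix, and the UKF variant mentioned in the abstract would replace the Jacobian F with sigma-point propagation through the same motion model.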