Enhancing Indoor Mobile Robot Localization through the Integration of Multi-Sensor Fusion Algorithms

Bibliographic Details
Published in: 2024 1st International Conference on Robotics, Engineering, Science, and Technology (RESTCON), pp. 101-105
Main Authors: Yatigul, Rut, Pudchuen, Noppadol, Blattler, Aran, Jitviriya, Wisanu
Format: Conference Proceeding
Language: English
Published: IEEE 16.02.2024
Description
Summary: This paper presents an approach that combines visual odometry, an Inertial Measurement Unit (IMU), and wheel odometry, using the Extended Kalman Filter and Unscented Kalman Filter to enhance mobile robot localization and mapping. The primary goal is to improve localization accuracy and robustness while providing a cost-effective alternative to traditional SLAM methods that rely on expensive LiDAR and RGB-D camera systems. By integrating visual odometry, IMU data, and wheel odometry, our method not only enhances the precision and robustness of the mobile robot localization system but also reduces the financial burden associated with high-end sensor hardware. This fusion creates a reliable solution, particularly suited to resource-constrained environments. Our research contributes to the democratization of SLAM technologies, making them accessible to a wider range of applications. The results presented in this paper showcase the potential of our approach and highlight the pivotal role of the multi-sensor fusion framework, which significantly enhances localization accuracy and robustness.
DOI:10.1109/RESTCON60981.2024.10463559
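Note: The record above gives no implementation details. The following is a minimal, self-contained Python sketch of the kind of EKF-based fusion the abstract describes: a wheel-odometry motion model in the prediction step, with visual-odometry and IMU readings as measurement updates. The class name, noise covariances, and sensor values are illustrative assumptions, not taken from the paper.

import numpy as np

# State: [x, y, theta]. Wheel odometry drives the prediction step;
# a visual-odometry pose estimate and an IMU yaw reading serve as
# measurements in the update step. All covariances are placeholders.
class FusionEKF:
    def __init__(self):
        self.x = np.zeros(3)                  # pose estimate [x, y, theta]
        self.P = np.eye(3) * 0.1              # state covariance
        self.Q = np.diag([0.02, 0.02, 0.01])  # process noise (assumed)

    def predict(self, v, omega, dt):
        """Propagate the pose with a unicycle model from wheel odometry."""
        theta = self.x[2]
        self.x += np.array([v * np.cos(theta) * dt,
                            v * np.sin(theta) * dt,
                            omega * dt])
        # Jacobian of the motion model with respect to the state
        F = np.array([[1.0, 0.0, -v * np.sin(theta) * dt],
                      [0.0, 1.0,  v * np.cos(theta) * dt],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z, R, H):
        """Standard EKF update with a linear measurement model z = H x."""
        y = z - H @ self.x                   # innovation (angle wrapping omitted)
        S = H @ self.P @ H.T + R             # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(3) - K @ H) @ self.P

ekf = FusionEKF()
ekf.predict(v=0.5, omega=0.1, dt=0.02)            # wheel odometry input
ekf.update(z=np.array([0.011, 0.0, 0.002]),       # visual-odometry pose
           R=np.diag([0.05, 0.05, 0.02]), H=np.eye(3))
ekf.update(z=np.array([0.0021]),                  # IMU yaw measurement
           R=np.array([[0.01]]), H=np.array([[0.0, 0.0, 1.0]]))
print(ekf.x)

Because the cheaper sensors disagree in complementary ways (wheel slip, VO scale drift, IMU bias), weighting each by its own covariance in the update step is what lets this kind of fusion approach the robustness of LiDAR-based SLAM at a fraction of the hardware cost.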