Enhancing Indoor Mobile Robot Localization through the Integration of Multi-Sensor Fusion Algorithms
| Published in: | 2024 1st International Conference on Robotics, Engineering, Science, and Technology (RESTCON), pp. 101–105 |
|---|---|
| Main authors: | , , , |
| Format: | Conference paper |
| Language: | English |
| Publication details: | IEEE, 16.02.2024 |
| Subject: | |
| Online access: | Get full text |
| Summary: | This paper presents an innovative approach that combines visual odometry, Inertial Measurement Unit (IMU), and wheel odometry, using the Extended Kalman Filter and Unscented Kalman Filter to enhance mobile robot localization and mapping. The primary goal is to improve localization accuracy and robustness while providing a cost-effective alternative to traditional SLAM methods that rely on expensive LIDAR and RGBD camera systems. By integrating visual odometry, IMU data, and wheel odometry, our method not only enhances the precision and robustness of the mobile robot localization system but also reduces the financial burden associated with high-end sensor hardware. This fusion creates a reliable solution, particularly suited for resource-constrained environments. Our research contributes to the democratization of SLAM technologies, making them more accessible to a wider range of applications. The results presented in this paper showcase the potential of our approach. This paper highlights the pivotal role of a multi-sensor fusion framework, which significantly enhances localization accuracy and robustness. |
| DOI: | 10.1109/RESTCON60981.2024.10463559 |
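
The summary describes fusing wheel odometry, IMU, and visual odometry with an Extended Kalman Filter. The paper's actual state vector, motion model, and noise parameters are not given in this record, so the following is only a minimal illustrative sketch of one EKF predict/update cycle for a 2-D robot pose, with all numeric values assumed:

```python
import numpy as np

# Illustrative EKF sketch for 2-D pose fusion (NOT the authors' implementation;
# the state layout, models, and noise values below are assumptions).
# State: [x, y, theta]. Prediction uses wheel-odometry inputs (v, w);
# correction uses a visual-odometry pose measurement.

def predict(x, P, v, w, dt, Q):
    """Propagate the state with a unicycle motion model."""
    theta = x[2]
    x_pred = x + np.array([v * dt * np.cos(theta),
                           v * dt * np.sin(theta),
                           w * dt])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1.0, 0.0, -v * dt * np.sin(theta)],
                  [0.0, 1.0,  v * dt * np.cos(theta)],
                  [0.0, 0.0,  1.0]])
    return x_pred, F @ P @ F.T + Q

def update(x, P, z, R):
    """Correct with a direct pose measurement (H = identity)."""
    H = np.eye(3)
    y = z - H @ x                     # innovation
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    return x + K @ y, (np.eye(3) - K @ H) @ P

# Example: one 0.1 s step driving straight at 1 m/s, then a
# visual-odometry fix pulls the estimate toward the measurement.
x = np.zeros(3)
P = np.eye(3) * 0.1
Q = np.eye(3) * 1e-3   # assumed process noise
R = np.eye(3) * 1e-2   # assumed measurement noise
x, P = predict(x, P, v=1.0, w=0.0, dt=0.1, Q=Q)
x, P = update(x, P, z=np.array([0.12, 0.0, 0.0]), R=R)
```

In a full multi-sensor setup, each sensor (wheel odometry, IMU, visual odometry) would contribute its own measurement model and update step at its own rate; the Unscented Kalman Filter variant replaces the Jacobian-based propagation with sigma-point propagation.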