SIR-SLAM: A Robust and Efficient Visual-Inertial Odometry System with IMU-RANSAC and Smooth Non-linear Optimization

Bibliographic Details
Published in: Journal of Physics: Conference Series, Vol. 3055, No. 1, pp. 12021-12030
Main Authors: Liu, Zican; Hu, Zhuhua; Zhao, Yaochi
Format: Journal Article
Language: English
Published: Bristol: IOP Publishing, 01.07.2025
ISSN: 1742-6588, 1742-6596
Description
Summary: Visual–inertial odometry (VIO) fuses camera and inertial measurements to enable real-time state estimation and map reconstruction. However, in low-texture scenes, under abrupt illumination changes, or in highly dynamic environments, conventional VIO pipelines often suffer from degraded accuracy and poor real-time performance. To address these challenges, SIR-SLAM, an enhanced VIO framework, is presented; it integrates an inertial-guided RANSAC (IMU-RANSAC) front-end with a differentiable Levenberg–Marquardt (D-LM) back-end optimizer. IMU-RANSAC leverages inertial priors to identify and discard outlier feature correspondences, thereby improving tracking robustness. The proposed D-LM algorithm introduces a smooth, differentiable trust-region adjustment strategy, which stabilizes damping-factor updates and accelerates convergence of the non-linear bundle adjustment. Extensive experiments on the EuRoC benchmark demonstrate that SIR-SLAM consistently outperforms state-of-the-art baselines in terms of trajectory accuracy, runtime throughput, and inlier-matching ratio. In particular, the system exhibits superior adaptability and robustness in sequences characterized by aggressive motions and severe illumination variations.
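
The abstract describes the two components only at a high level. As an illustrative aid, the Python sketch below shows one plausible shape for each idea: pre-filtering feature matches against an IMU-predicted rotation before a standard RANSAC stage, and a smooth, Nielsen-style adjustment of the Levenberg–Marquardt damping factor driven by the gain ratio. The function names, thresholds, and the specific update rule are assumptions introduced here for clarity; they are not taken from the paper.

import numpy as np

# Hypothetical sketch, not the authors' implementation: interfaces, names, and
# constants below are assumptions made for illustration only.

def imu_guided_prefilter(bearings_prev, bearings_curr, R_imu, angle_thresh_rad=0.05):
    """Reject feature matches that disagree with an IMU-predicted rotation.

    bearings_prev, bearings_curr: (N, 3) unit bearing vectors in the two frames.
    R_imu: 3x3 relative rotation predicted by integrating the gyroscope.
    Returns a boolean mask of matches to keep for the subsequent RANSAC stage.
    """
    predicted = bearings_prev @ R_imu.T                    # previous bearings rotated into the current frame
    cos_angle = np.sum(predicted * bearings_curr, axis=1).clip(-1.0, 1.0)
    return np.arccos(cos_angle) < angle_thresh_rad         # keep matches consistent with the inertial prior

def smooth_damping_update(lam, rho, lam_min=1e-9, lam_max=1e4):
    """Smoothly adjust the Levenberg–Marquardt damping factor from the gain ratio.

    rho: gain ratio = actual cost reduction / reduction predicted by the
    quadratic model. A Nielsen-style polynomial of rho replaces the classic
    discrete multiply/divide rule, so the damping factor varies continuously
    with step quality.
    """
    if rho > 0.0:                                          # step accepted: shrink damping smoothly
        lam *= max(1.0 / 3.0, 1.0 - (2.0 * rho - 1.0) ** 3)
    else:                                                  # step rejected: inflate damping
        lam *= 2.0
    return float(np.clip(lam, lam_min, lam_max))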
DOI: 10.1088/1742-6596/3055/1/012021