MIMOSA: A Multi-Modal SLAM Framework for Resilient Autonomy against Sensor Degradation

Bibliographic Details
Published in: Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 7153 - 7159
Main Authors: Khedekar, Nikhil; Kulkarni, Mihir; Alexis, Kostas
Format: Conference Proceeding
Language: English
Published: IEEE, 23.10.2022
ISSN: 2153-0866
Description
Summary: This paper presents a framework for Multi-Modal SLAM (MIMOSA) that utilizes a nonlinear factor graph as the underlying representation to provide loosely-coupled fusion of any number of sensing modalities. Tailored to the goal of enabling resilient robotic autonomy in GPS-denied and perceptually-degraded environments, MIMOSA currently contains modules for pointcloud registration, fusion of multiple odometry estimates relying on visible-light and thermal vision, as well as inertial measurement propagation. A flexible back-end utilizes the estimates from various modalities as relative transformation factors. The method is designed to be robust to degeneracy through the maintenance and tracking of modality-specific health metrics, while also being inherently tolerant to sensor failure. We detail this framework alongside our implementation for handling high-rate asynchronous sensor measurements and evaluate its performance on data from autonomous subterranean robotic exploration missions using legged and aerial robots.
DOI: 10.1109/IROS47612.2022.9981108
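
The back-end described in the summary fuses per-modality odometry estimates as relative transformation factors in a nonlinear factor graph. The sketch below illustrates that general idea using the GTSAM Python bindings; it is not the authors' implementation, and the pose keys, health values, and health-based noise scaling are illustrative assumptions rather than MIMOSA's actual health metrics.

```python
# Minimal sketch (assumed, not the authors' code): loosely-coupled fusion of
# per-modality relative-odometry estimates as between-factors in a pose graph.
# Assumes the GTSAM >= 4.1 Python bindings (pip install gtsam).
import numpy as np
import gtsam

graph = gtsam.NonlinearFactorGraph()
initial = gtsam.Values()

# Anchor pose 0 at the origin with a tight prior.
prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 1e-3))
graph.add(gtsam.PriorFactorPose3(0, gtsam.Pose3(), prior_noise))
initial.insert(0, gtsam.Pose3())

# Hypothetical relative-transform estimates for the same interval (pose 0 -> 1)
# from two modalities: (name, translation estimate [m], health score in (0, 1]).
odometry = [
    ("lidar",   np.array([1.0, 0.00, 0.0]), 0.9),
    ("thermal", np.array([1.2, 0.10, 0.0]), 0.3),
]

for name, t, health in odometry:
    delta = gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(*t))
    # Assumed weighting: lower health -> larger sigmas -> weaker influence.
    sigmas = np.full(6, 0.05) / max(health, 1e-2)
    noise = gtsam.noiseModel.Diagonal.Sigmas(sigmas)
    graph.add(gtsam.BetweenFactorPose3(0, 1, delta, noise))

# Initialize pose 1 from the first modality's estimate and optimize.
initial.insert(1, gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(1.0, 0.0, 0.0)))
result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
print(result.atPose3(1).translation())  # fused relative pose (translation part)
```

With this weighting, the fused pose is pulled more strongly toward the high-health lidar estimate, and a failed modality simply contributes no factor, which mirrors the tolerance to sensor failure described in the summary.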