Unsupervised Deep Homography: A Fast and Robust Homography Estimation Model

Bibliographic Details
Published in: IEEE Robotics and Automation Letters, Vol. 3, No. 3, pp. 2346-2353
Main Authors: Nguyen, Ty; Chen, Steven W.; Shivakumar, Shreyas S.; Taylor, Camillo Jose; Kumar, Vijay
Format: Journal Article
Language: English
Published: Piscataway, NJ: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.07.2018
ISSN: 2377-3766
Description
Summary: Homography estimation between multiple aerial images can provide relative pose estimation for collaborative autonomous exploration and monitoring. Use on a robotic system requires a fast and robust homography estimation algorithm. In this letter, we propose an unsupervised learning algorithm that trains a deep convolutional neural network to estimate planar homographies. We compare the proposed algorithm to traditional feature-based and direct methods, as well as a corresponding supervised learning algorithm. Our empirical results demonstrate that, compared to traditional approaches, the unsupervised algorithm achieves faster inference speed while maintaining comparable or better accuracy and robustness to illumination variation. In addition, our unsupervised method has superior adaptability and performance compared to the corresponding supervised deep learning method. Our image dataset and a TensorFlow implementation of our work are available at https://github.com/tynguyen/unsupervisedDeepHomographyRAL2018.
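
Illustrative sketch (not the authors' TensorFlow model): the snippet below shows a traditional feature-based homography baseline of the kind the letter compares against, followed by a photometric error term that is the non-differentiable analogue of the loss an unsupervised network would minimize in place of ground-truth homographies. The choice of ORB features, the RANSAC threshold, and the file names are assumptions for illustration only; the letter's actual baselines and loss may differ.

import cv2
import numpy as np


def feature_based_homography(img_a, img_b, n_features=1000):
    """Estimate a 3x3 homography mapping img_a onto img_b with ORB + RANSAC."""
    orb = cv2.ORB_create(nfeatures=n_features)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_a, des_b)
    pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(pts_a, pts_b, cv2.RANSAC, 5.0)
    return H


def photometric_error(img_a, img_b, H):
    """Mean absolute intensity difference after warping img_a by H."""
    h, w = img_b.shape[:2]
    warped = cv2.warpPerspective(img_a, H, (w, h))
    return float(np.mean(np.abs(warped.astype(np.float32) - img_b.astype(np.float32))))


if __name__ == "__main__":
    # Hypothetical aerial image pair; substitute your own grayscale images.
    img_a = cv2.imread("aerial_1.png", cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread("aerial_2.png", cv2.IMREAD_GRAYSCALE)
    H = feature_based_homography(img_a, img_b)
    print("Estimated homography:\n", H)
    print("Photometric error:", photometric_error(img_a, img_b, H))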
DOI: 10.1109/LRA.2018.2809549