Learning Inverse Kinodynamics for Accurate High-Speed Off-Road Navigation on Unstructured Terrain

Detailed Bibliography
Published in: IEEE Robotics and Automation Letters, Vol. 6, No. 3, pp. 6054-6060
Main Authors: Xiao, Xuesu; Biswas, Joydeep; Stone, Peter
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.07.2021
ISSN: 2377-3766
Description
Summary: This letter presents a learning-based approach to consider the effect of unobservable world states in kinodynamic motion planning in order to enable accurate high-speed off-road navigation on unstructured terrain. Existing kinodynamic motion planners either operate in structured and homogeneous environments and thus do not need to explicitly account for terrain-vehicle interaction, or assume a set of discrete terrain classes. However, when operating on unstructured terrain, especially at high speeds, even small variations in the environment will be magnified and cause inaccurate plan execution. In this letter, to capture the complex kinodynamic model and mathematically unknown world state, we learn a kinodynamic planner in a data-driven manner with onboard inertial observations. Our approach is tested on a physical robot in different indoor and outdoor environments, enables fast and accurate off-road navigation, and outperforms environment-independent alternatives, demonstrating 52.4% to 86.9% improvement in terms of plan execution success rate while traveling at high speeds.
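The core idea in the abstract is to learn the inverse of the terrain-dependent kinodynamic function from driving data, with onboard inertial observations serving as a proxy for the unobservable world state. The following is a minimal illustrative sketch of that idea in Python/PyTorch; all module names, tensor shapes, and the training-pair construction are assumptions made for exposition, not the paper's actual implementation.

# Hypothetical sketch of an inverse kinodynamics (IKD) model: given the
# motion the robot should realize and a recent window of IMU readings
# (standing in for unobservable terrain state), predict the control to
# issue. All names and dimensions below are illustrative assumptions.
import torch
import torch.nn as nn

class InverseKinodynamicsModel(nn.Module):
    def __init__(self, imu_window: int = 50, imu_dim: int = 6, hidden: int = 128):
        super().__init__()
        # Input: desired (v, omega) plus a flattened IMU history window.
        self.net = nn.Sequential(
            nn.Linear(2 + imu_window * imu_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 2),  # corrected (v, omega) for the low-level controller
        )

    def forward(self, desired_motion: torch.Tensor, imu_history: torch.Tensor) -> torch.Tensor:
        # desired_motion: (batch, 2); imu_history: (batch, imu_window, imu_dim)
        x = torch.cat([desired_motion, imu_history.flatten(start_dim=1)], dim=1)
        return self.net(x)

def train_step(model, optimizer, realized_motion, imu_history, executed_cmd):
    # Training pairs invert the forward kinodynamics: from driving logs,
    # take the control that was actually executed together with the motion
    # that actually resulted (e.g. estimated from odometry), and regress
    # the executed control from realized motion plus IMU context.
    predicted_cmd = model(realized_motion, imu_history)
    loss = nn.functional.mse_loss(predicted_cmd, executed_cmd)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

Under this reading, at planning time the learned model would replace an idealized kinodynamic model: the planner's desired motion and the live IMU window are passed through the network, and the corrected command is executed, which is what would close the gap between planned and executed trajectories at high speed.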
DOI: 10.1109/LRA.2021.3090023