Overworld: Assessing the Geometry of the World for Human-Robot Interaction

Detailed Bibliography
Published in: IEEE Robotics and Automation Letters, Volume 8, Issue 3, pp. 1874-1880
Main author: Sarthou, Guillaume
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.03.2023
ISSN: 2377-3766
Description
Abstract: For a robot to interact with humans in a given environment, a key need is to understand that environment in terms of the objects composing it, the other agents acting in it, and the relations between all of them. This capability is often called geometrical situation assessment and is mainly related to spatial reasoning over time. In this letter, we present Overworld, a novel, lightweight, open-source framework that merges the key features of a decade of research in the domain. It permanently maintains a geometric state of the world from the robot's point of view by aggregating perceptual information from several sources and reasoning about it to build a coherent world. Furthermore, Overworld implements perspective-taking by emulating humans' perception to estimate the state of the world from their perspective. Finally, thanks to a strong link with an ontology framework, it ensures knowledge coherence across the whole robotic architecture. This work is part of a broader effort to develop a complete, stable, and shareable decisional robotic architecture for Human-Robot Interaction.
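The abstract names two core mechanisms: aggregating observations from several perception sources into one coherent, robot-centric world state, and perspective-taking, i.e., estimating that same state as each human would perceive it. The sketch below is a minimal illustration of those two ideas in Python, not the actual Overworld API; all class and function names (WorldState, PerceptionSource, estimate_agent_view, the field-of-view rule) are hypothetical and chosen only for the example.

```python
# Hypothetical sketch, not the Overworld API:
# (1) merge observations from several perception sources into one world state,
# (2) perspective-taking via a simple field-of-view visibility check.
import math
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Entity:
    name: str
    x: float
    y: float
    stamp: float        # time of the observation
    confidence: float   # how much the source trusts this estimate


class PerceptionSource:
    """A source of observations, e.g. a camera pipeline or a motion-capture feed."""

    def __init__(self, name: str):
        self.name = name
        self._pending: List[Entity] = []

    def push(self, entity: Entity) -> None:
        self._pending.append(entity)

    def poll(self) -> List[Entity]:
        out, self._pending = self._pending, []
        return out


class WorldState:
    """Robot-centric world state aggregated from several perception sources."""

    def __init__(self, sources: List[PerceptionSource]):
        self.sources = sources
        self.entities: Dict[str, Entity] = {}

    def update(self) -> None:
        # Keep, per entity, the freshest observation; on ties, the most confident one.
        for source in self.sources:
            for obs in source.poll():
                current = self.entities.get(obs.name)
                if (current is None
                        or obs.stamp > current.stamp
                        or (obs.stamp == current.stamp
                            and obs.confidence > current.confidence)):
                    self.entities[obs.name] = obs

    def estimate_agent_view(self, agent: str, fov_deg: float = 120.0) -> Dict[str, Entity]:
        """Perspective-taking: entities the given agent could plausibly see."""
        observer = self.entities[agent]
        half_fov = math.radians(fov_deg) / 2.0
        visible = {}
        for name, ent in self.entities.items():
            if name == agent:
                continue
            # Assume the agent faces +x; keep entities inside the angular cone.
            angle = math.atan2(ent.y - observer.y, ent.x - observer.x)
            if abs(angle) <= half_fov:
                visible[name] = ent
        return visible


if __name__ == "__main__":
    camera = PerceptionSource("camera")
    mocap = PerceptionSource("mocap")
    world = WorldState([camera, mocap])

    camera.push(Entity("cup", x=1.0, y=0.2, stamp=1.0, confidence=0.6))
    mocap.push(Entity("cup", x=1.1, y=0.25, stamp=2.0, confidence=0.9))   # fresher: wins
    camera.push(Entity("human_1", x=0.0, y=0.0, stamp=2.0, confidence=0.9))
    camera.push(Entity("book", x=-1.0, y=0.5, stamp=2.0, confidence=0.8)) # behind human_1

    world.update()
    print(world.estimate_agent_view("human_1"))  # the cup is in view, the book is not
```

In the real system the merging and visibility reasoning are of course far richer (3D geometry, occlusion, temporal reasoning, ontology grounding); the sketch only mirrors the data flow the abstract describes.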
DOI: 10.1109/LRA.2023.3238891