Gesture-Based Extraction of Robot Skill Parameters for Intuitive Robot Programming

Detailed Bibliography
Published in: Journal of Intelligent & Robotic Systems, Volume 80, Issue Suppl 1, pp. 149–163
Main Authors: Pedersen, Mikkel Rath; Krüger, Volker
Format: Journal Article
Language: English
Published: Dordrecht: Springer Netherlands (Springer Nature B.V.), 01.12.2015
ISSN:0921-0296, 1573-0409
Description
Summary: Despite extensive research in the field, very little experience exists with Teaching by Demonstration (TbD) in actual industrial use cases. In the factory of the future, flexible mobile manipulators must be rapidly reprogrammed to perform new tasks as the need arises, for which a working system capable of TbD would be ideal. Contrary to current TbD approaches, which generally aim to recognize both the action and where it is applied, we propose a division of labor in which the operator manually specifies the action the robot should perform, while gestures are used to specify the relevant action parameter (e.g. the object on which to apply the action). This two-step method has several advantages: there is no uncertainty about which action the robot will perform; it accounts for changes in the environment, so objects do not need to be at predefined locations; and parameter specification is possible even for inexperienced users. Experiments with 24 people in 3 different environments verify that it is indeed intuitive, even for a robotics novice, to program a mobile manipulator using this method.
DOI:10.1007/s10846-015-0219-x