Learning Physical Collaborative Robot Behaviors From Human Demonstrations

Bibliographic Details
Published in: IEEE Transactions on Robotics, Vol. 32, No. 3, pp. 513-527
Main Authors: Rozo, Leonel; Calinon, Sylvain; Caldwell, Darwin G.; Jimenez, Pablo; Torras, Carme
Format: Journal Article
Language: English
Published: New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.06.2016
ISSN: 1552-3098, 1941-0468
Description
Summary: Robots are becoming safe and smart enough to work alongside people not only on manufacturing production lines, but also in spaces such as houses, museums, or hospitals. This can be significantly exploited in situations in which a human needs the help of another person to perform a task, because a robot may take the role of the helper. In this sense, a human and the robotic assistant may cooperatively carry out a variety of tasks, therefore requiring the robot to communicate with the person, understand his/her needs, and behave accordingly. To achieve this, we propose a framework for a user to teach a robot collaborative skills from demonstrations. We mainly focus on tasks involving physical contact with the user, in which not only position, but also force sensing and compliance become highly relevant. Specifically, we present an approach that combines probabilistic learning, dynamical systems, and stiffness estimation to encode the robot behavior along the task. Our method allows a robot to learn not only trajectory following skills, but also impedance behaviors. To show the functionality and flexibility of our approach, two different testbeds are used: a transportation task and a collaborative table assembly.
DOI: 10.1109/TRO.2016.2540623
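
Illustrative sketch (not the paper's implementation): the abstract describes combining probabilistic learning with stiffness estimation so that the robot reproduces both a reference trajectory and an impedance behavior learned from demonstrations. The snippet below sketches one common way such an encoding can be done: fit a Gaussian mixture model to time-indexed demonstrations, retrieve a reference trajectory by Gaussian mixture regression, and map the retrieved covariance to a stiffness profile (low variability across demonstrations implies stiffer behavior). All function names, the stiffness mapping, and parameter values are assumptions made here for illustration only.

```python
# Illustrative sketch only -- NOT the authors' implementation. It fits a GMM to
# time-indexed demonstrations, retrieves a reference trajectory via Gaussian
# mixture regression (GMR), and derives an assumed variance-based stiffness.
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_joint_gmm(demos, n_components=5, seed=0):
    """demos: list of (T, D) arrays of end-effector positions.
    A normalized time index is prepended as the GMR input dimension."""
    data = np.vstack([
        np.column_stack([np.linspace(0.0, 1.0, len(d)), d]) for d in demos
    ])
    return GaussianMixture(n_components=n_components, covariance_type="full",
                           random_state=seed).fit(data)

def gmr(gmm, t):
    """Condition the joint GMM on time t; return output mean and covariance."""
    mus, sigmas, pis = gmm.means_, gmm.covariances_, gmm.weights_
    K, D = mus.shape
    # responsibility of each component for the time input t
    h = np.array([pi * np.exp(-0.5 * (t - mu[0]) ** 2 / sig[0, 0]) / np.sqrt(sig[0, 0])
                  for pi, mu, sig in zip(pis, mus, sigmas)])
    h /= h.sum()
    mu_out = np.zeros(D - 1)
    sigma_out = np.zeros((D - 1, D - 1))
    for k in range(K):
        mu_t, mu_x = mus[k, 0], mus[k, 1:]
        s_tt, s_xt, s_xx = sigmas[k, 0, 0], sigmas[k, 1:, 0], sigmas[k, 1:, 1:]
        mu_k = mu_x + s_xt * (t - mu_t) / s_tt       # conditional mean
        sig_k = s_xx - np.outer(s_xt, s_xt) / s_tt   # conditional covariance
        mu_out += h[k] * mu_k
        sigma_out += h[k] * (sig_k + np.outer(mu_k, mu_k))
    sigma_out -= np.outer(mu_out, mu_out)
    return mu_out, sigma_out

def stiffness_from_covariance(sigma, scale=1.0, k_min=10.0, k_max=500.0):
    """Assumed mapping: stiffness inversely proportional to the variance
    observed across demonstrations, clipped to a safe range."""
    return np.clip(scale / np.diag(sigma), k_min, k_max)

# Example usage with synthetic 3-D demonstrations:
demos = [np.cumsum(np.random.randn(200, 3) * 0.01, axis=0) for _ in range(5)]
model = fit_joint_gmm(demos)
for t in np.linspace(0.0, 1.0, 5):
    x_ref, cov = gmr(model, t)
    k_gain = stiffness_from_covariance(cov)
```

In an impedance-control setting, x_ref and k_gain would feed a Cartesian controller as the reference position and stiffness gains; the paper additionally exploits force sensing and dynamical systems, which this generic sketch does not cover.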