Abstracting perception and manipulation in end-user robot programming using Sikuli

Bibliographic Details
Published in: IEEE International Conference on Technologies for Practical Robot Applications (Print), pp. 1 - 6
Main Authors: Kasper, Michael; Correll, Nikolaus; Yeh, Tom
Format: Conference Proceeding
Language: English
Published: IEEE, 01.04.2014
ISSN: 2325-0526
Description
Summary: We propose a programming paradigm for robotics that has the potential to drastically facilitate robot programming. Building on Sikuli, a GUI automation language, we abstract specific robotic perception and control capabilities into first-class objects that are embedded in a simple scripting language. Currently, robotics programming requires a deep understanding of perception, control, and algorithms; knowledge of a specific robot's perception capabilities and kinematics; and a substantial amount of software engineering. Although learning by demonstration also allows relatively unskilled users to adapt a robot to their needs, this approach is intrinsically limited in the complexity the resulting program can reach. This paper presents a proof of concept for migrating Sikuli from the virtual GUI workspace of computer software to the physical 3D workspace of robotics. It then presents an example use case that illustrates the power of this new approach: a simple script that arranges a set of randomly aligned blocks into a tower using a Baxter robot equipped with an Asus Xtion Pro.
DOI: 10.1109/TePRA.2014.6869156
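
The record does not reproduce the paper's actual scripting interface, but the summary describes the paradigm well enough to sketch its flavor: perception ("find objects that look like this template") and manipulation ("pick it up, place it there") become one-line calls in a short Sikuli-style script. The Python sketch below is purely illustrative; `RobotWorkspace`, `Block`, `find_all`, `pick`, and `place_on` are assumed names, not the paper's API, and the workspace class is a stub so the example runs on its own.

```python
# Illustrative sketch of the paradigm described in the abstract.
# None of these names come from the paper; RobotWorkspace is a stand-in
# stub, not a real Sikuli or Baxter interface.

from dataclasses import dataclass
from typing import List


@dataclass
class Block:
    """A detected object in the physical workspace (position in metres)."""
    x: float
    y: float
    z: float


class RobotWorkspace:
    """Stand-in for the physical 3D workspace that replaces Sikuli's GUI screenshot."""

    def find_all(self, template: str) -> List[Block]:
        # A real system would run perception on RGB-D data (e.g. from an
        # Asus Xtion Pro) and return the detections matching `template`.
        return [Block(0.40, -0.10, 0.0), Block(0.35, 0.05, 0.0), Block(0.50, 0.00, 0.0)]

    def pick(self, block: Block) -> None:
        print(f"pick block at ({block.x:.2f}, {block.y:.2f}, {block.z:.2f})")

    def place_on(self, target: Block) -> None:
        print(f"place on block at ({target.x:.2f}, {target.y:.2f}, {target.z:.2f})")


# End-user script: the whole "arrange blocks into a tower" task.
workspace = RobotWorkspace()
blocks = workspace.find_all("block.png")   # perception as a one-liner

tower_top = blocks[0]
for block in blocks[1:]:
    workspace.pick(block)                  # manipulation as a one-liner
    workspace.place_on(tower_top)
    tower_top = block                      # the placed block becomes the new top
```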