Vision-assisted control for manipulation using virtual fixtures

Bibliographic Details
Published in: IEEE Transactions on Robotics, Vol. 20, No. 6, pp. 953-966
Authors: Bettini, A., Marayong, P., Lang, S., Okamura, A.M., Hager, G.D.
Format: Journal Article
Language: English
Published: New York, NY: IEEE, 01.12.2004
ISSN: 1552-3098, 1941-0468
Online access: Full text
Description
Summary: We present the design and implementation of a vision-based system for cooperative manipulation at millimeter to micrometer scales. The system is based on an admittance control algorithm that implements a broad class of guidance modes called virtual fixtures. A virtual fixture, like a real fixture, limits the motion of a tool to a prescribed class or range of motions. We describe how both hard (unyielding) and soft (yielding) virtual fixtures can be implemented in this control framework. We then detail the construction of virtual fixtures for point positioning and curve following, as well as extensions of these to tubes, cones, and sequences thereof. We also describe an implemented system using the JHU Steady Hand Robot. The system uses computer vision as a sensor for providing a reference trajectory, and the virtual fixture control algorithm then provides haptic feedback to implement direct, shared manipulation. We provide extensive experimental results detailing both system performance and the effects of virtual fixtures on human speed and accuracy.
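The hard and soft virtual fixtures described in the summary can be sketched as an admittance law that attenuates the component of the user's applied force orthogonal to a preferred motion direction. The sketch below is illustrative only; the function name, gains, and the projection onto a single preferred direction are assumptions for this minimal example, not the paper's exact formulation.

```python
import numpy as np

def virtual_fixture_velocity(force, preferred_dir, c=1.0, c_tau=0.0):
    """Illustrative admittance-style virtual fixture law.

    Decomposes the applied force into a component along the preferred
    (fixture) direction and an orthogonal component, then scales the
    orthogonal component by the compliance factor c_tau:
      c_tau = 0      -> hard (unyielding) fixture
      0 < c_tau < 1  -> soft (yielding) fixture
      c_tau = 1      -> no fixture (isotropic admittance)
    """
    d = np.asarray(preferred_dir, dtype=float)
    d = d / np.linalg.norm(d)          # unit preferred direction
    f = np.asarray(force, dtype=float)
    f_par = np.dot(f, d) * d           # force component along the fixture
    f_perp = f - f_par                 # component pushing off the fixture
    return c * (f_par + c_tau * f_perp)

# Hard fixture: motion orthogonal to the preferred direction is removed.
v_hard = virtual_fixture_velocity([1.0, 1.0], [1.0, 0.0], c_tau=0.0)
# Soft fixture: orthogonal motion is attenuated but not eliminated.
v_soft = virtual_fixture_velocity([1.0, 1.0], [1.0, 0.0], c_tau=0.5)
```

Setting `c_tau` between the hard and free extremes is what lets a yielding fixture guide the operator toward a vision-derived reference path while still permitting deliberate deviation from it.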
DOI: 10.1109/TRO.2004.829483