Vision-assisted control for manipulation using virtual fixtures

Detailed Bibliography
Published in: IEEE Transactions on Robotics, Volume 20, Issue 6, pp. 953–966
Main Authors: Bettini, A., Marayong, P., Lang, S., Okamura, A.M., Hager, G.D.
Format: Journal Article
Language: English
Published: New York, NY: IEEE, 1 December 2004
ISSN: 1552-3098, 1941-0468
Description
Summary: We present the design and implementation of a vision-based system for cooperative manipulation at millimeter to micrometer scales. The system is based on an admittance control algorithm that implements a broad class of guidance modes called virtual fixtures. A virtual fixture, like a real fixture, limits the motion of a tool to a prescribed class or range of motions. We describe how both hard (unyielding) and soft (yielding) virtual fixtures can be implemented in this control framework. We then detail the construction of virtual fixtures for point positioning and curve following as well as extensions of these to tubes, cones, and sequences thereof. We also describe an implemented system using the JHU Steady Hand Robot. The system uses computer vision as a sensor for providing a reference trajectory, and the virtual fixture control algorithm then provides haptic feedback to implement direct, shared manipulation. We provide extensive experimental results detailing both system performance and the effects of virtual fixtures on human speed and accuracy.
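The hard/soft fixture distinction described in the summary can be illustrated with a minimal admittance-law sketch: commanded tool velocity is proportional to the user's applied force, with the component orthogonal to a preferred direction attenuated by a compliance factor. This is an illustrative reconstruction, not the paper's exact controller; the gain `k`, compliance `c`, and the projection form are assumptions based on the description of hard (unyielding) and soft (yielding) fixtures.

```python
import numpy as np

def admittance_velocity(f, d, k=1.0, c=0.3):
    """Map an applied force f to a commanded tool velocity under a
    virtual fixture with preferred direction d.

    c = 1     -> no fixture (isotropic admittance)
    0 < c < 1 -> soft (yielding) fixture
    c = 0     -> hard (unyielding) fixture: motion only along d
    """
    d = np.asarray(d, dtype=float)
    d = d / np.linalg.norm(d)       # unit preferred direction
    D = np.outer(d, d)              # projector onto the preferred direction
    I = np.eye(len(d))
    # Full gain along the fixture, attenuated gain off the fixture.
    return k * (D + c * (I - D)) @ np.asarray(f, dtype=float)
```

With `c = 0.5`, a force of `[1, 1]` against a preferred direction `[1, 0]` yields velocity `[1.0, 0.5]`: motion along the fixture passes at full gain while the off-axis component is halved. Setting `c = 0` suppresses off-axis motion entirely, which is the hard-fixture case.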
DOI: 10.1109/TRO.2004.829483