Generation of GelSight Tactile Images for Sim2Real Learning

Detailed bibliography
Published in: IEEE Robotics and Automation Letters, Volume 6, Issue 2, pp. 4177-4184
Main authors: Gomes, Daniel Fernandes; Paoletti, Paolo; Luo, Shan
Format: Journal Article
Language: English
Publication details: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.04.2021
ISSN: 2377-3766
Description
Summary: Most current works in Sim2Real learning for robotic manipulation tasks leverage camera vision that may be significantly occluded by robot hands during the manipulation. Tactile sensing offers complementary information to vision and can compensate for the information loss caused by the occlusions. However, the use of tactile sensing is restricted in Sim2Real research because no simulated tactile sensors are available. To mitigate the gap, we introduce a novel approach for simulating a GelSight tactile sensor in the commonly used Gazebo simulator. Similar to the real GelSight sensor, the simulated sensor can produce high-resolution images from depth-maps captured by a simulated optical sensor, and reconstruct the interaction between the touched object and an opaque soft membrane. It can indirectly sense forces, geometry, texture and other properties of the object and enables Sim2Real learning with tactile sensing. Preliminary experimental results have shown that the simulated sensor could generate realistic outputs similar to the ones captured by a real GelSight sensor. All the materials used in this letter are available at https://danfergo.github.io/gelsight-simulation .
DOI: 10.1109/LRA.2021.3063925
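
The summary describes rendering high-resolution tactile images from depth maps captured by a simulated optical sensor behind a soft membrane. The Python sketch below is only an illustrative approximation of that idea, not the authors' implementation: it smooths a depth map to mimic the elastomer deformation, estimates surface normals from the depth gradients, and shades them with three coloured directional lights in a simple Lambertian/Phong-style model. All function names, parameters and light positions here are hypothetical.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def render_tactile_image(depth_map, sigma=2.0, light_dirs=None, light_colors=None):
        """Render a GelSight-like RGB image from a depth map (illustrative sketch only)."""
        # Approximate the soft membrane by low-pass filtering the depth map.
        elastomer = gaussian_filter(depth_map.astype(np.float64), sigma=sigma)

        # Estimate surface normals from the depth gradients.
        dz_dy, dz_dx = np.gradient(elastomer)
        normals = np.dstack([-dz_dx, -dz_dy, np.ones_like(elastomer)])
        normals /= np.linalg.norm(normals, axis=2, keepdims=True)

        # Three coloured directional lights, loosely mimicking the internal
        # illumination of a GelSight sensor (assumed directions, not measured ones).
        if light_dirs is None:
            light_dirs = np.array([[1.0, 0.0, 0.5],
                                   [-0.5, 0.87, 0.5],
                                   [-0.5, -0.87, 0.5]])
            light_dirs /= np.linalg.norm(light_dirs, axis=1, keepdims=True)
        if light_colors is None:
            light_colors = np.eye(3)  # pure red, green and blue lights

        # Accumulate a diffuse (Lambertian) shading term per light source.
        img = np.zeros(depth_map.shape + (3,))
        for d, c in zip(light_dirs, light_colors):
            diffuse = np.clip(normals @ d, 0.0, None)
            img += diffuse[..., None] * c
        return np.clip(img / len(light_dirs), 0.0, 1.0)

    # Hypothetical usage with a synthetic depth map of a small bump pressed into the gel.
    yy, xx = np.mgrid[-1:1:480j, -1:1:640j]
    depth = np.exp(-(xx**2 + yy**2) / 0.05)
    tactile_rgb = render_tactile_image(depth)

In the letter itself the depth maps come from a simulated optical sensor inside Gazebo and the illumination model is calibrated against a real GelSight sensor; the sketch above only conveys the general depth-to-image rendering idea.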