Fast Payload Calibration for Sensorless Contact Estimation Using Model Pre-Training

Bibliographic Details
Published in: IEEE Robotics and Automation Letters, Vol. 9, Issue 10, pp. 9007-9014
Main authors: Shan, Shilin; Pham, Quang-Cuong
Format: Journal Article
Language: English
Published: IEEE, 01.10.2024
ISSN: 2377-3766
Online access: Full text
Description
Abstract: Force and torque sensing is crucial in robotic manipulation across both collaborative and industrial settings. Traditional methods for dynamics identification enable the detection and control of external forces and torques without the need for costly sensors. However, these approaches show limitations in scenarios where the robot dynamics, particularly the end-effector payload, are subject to change. Moreover, existing calibration techniques face trade-offs between efficiency and accuracy due to concerns over joint-space coverage. In this letter, we introduce a calibration scheme that leverages pre-trained neural network models to learn calibrated dynamics across a wide range of the joint space in advance. This offline learning strategy significantly reduces the need for online data collection, whether for selection of the optimal model or identification of payload features, requiring merely a 4-second trajectory for online calibration. This method is particularly effective in tasks that require frequent dynamics recalibration for precise contact estimation. We further demonstrate the efficacy of this approach through applications in sensorless joint and task compliance, accounting for payload variability.
DOI: 10.1109/LRA.2024.3455800
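
The abstract describes the approach only at a high level: dynamics models are pre-trained offline across the joint space, a short (~4 s) online trajectory calibrates for the current end-effector payload, and external contacts are then estimated without force/torque sensors. The sketch below is a hypothetical illustration of that pipeline, not the authors' implementation; the model bank, the toy regressor standing in for the pre-trained neural networks, and the residual-minimizing model selection are all assumptions made for the example.

```python
# Hypothetical sketch of the pipeline outlined in the abstract:
#  1) a bank of dynamics models is "pre-trained" offline,
#  2) a short online trajectory selects the model that best matches the
#     current payload,
#  3) external joint torque is estimated as the residual between measured
#     and predicted torque (sensorless contact estimation).
# All names and the toy regressor are illustrative assumptions.
import numpy as np


class PretrainedDynamicsModel:
    """Stand-in for a pre-trained neural network mapping joint state to torque."""

    def __init__(self, weights: np.ndarray):
        self.weights = weights  # parameters assumed learned offline

    def predict_torque(self, q, dq, ddq):
        # Toy feature map; in the paper this would be a neural network
        # evaluated on joint positions, velocities, and accelerations.
        features = np.concatenate([np.sin(q), dq, ddq])
        return self.weights @ features


def select_model(models, trajectory):
    """Pick the pre-trained model whose torque predictions best match the
    measurements collected over a short (e.g. ~4 s) calibration trajectory."""
    def fit_error(model):
        return sum(np.linalg.norm(tau - model.predict_torque(q, dq, ddq))
                   for q, dq, ddq, tau in trajectory)
    return min(models, key=fit_error)


def external_torque(model, q, dq, ddq, tau_measured):
    """Sensorless contact estimate: measured torque minus model prediction."""
    return tau_measured - model.predict_torque(q, dq, ddq)


if __name__ == "__main__":
    n_joints, n_features = 6, 18
    rng = np.random.default_rng(0)
    # Bank of pre-trained models (here: random toy parameters).
    models = [PretrainedDynamicsModel(rng.normal(size=(n_joints, n_features)))
              for _ in range(3)]
    # Fake 4-second calibration trajectory sampled at 10 Hz (40 samples).
    trajectory = [(rng.normal(size=n_joints), rng.normal(size=n_joints),
                   rng.normal(size=n_joints), rng.normal(size=n_joints))
                  for _ in range(40)]
    best = select_model(models, trajectory)
    # Simulate a contact wrench acting on joint 3 and recover it as a residual.
    q, dq, ddq = (rng.normal(size=n_joints) for _ in range(3))
    tau_meas = best.predict_torque(q, dq, ddq) + np.array([0, 0, 1.5, 0, 0, 0])
    print("estimated external torque:", external_torque(best, q, dq, ddq, tau_meas))
```

In this toy version, "calibration" reduces to picking the pre-trained model whose torque predictions best fit a short measured trajectory; the residual between measured and predicted torque is then the quantity that would drive sensorless joint or task compliance.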