Vision Perception-based Adaptive Pushing Assisted Grasping Network for Dense Clutters


Bibliographic Details
Published in: Chinese Control Conference, pp. 8411-8416
Main Authors: Liu, Xinqi; Chai, Runqi; Wang, Shuo; Chai, Senchun; Xia, Yuanqing
Format: Conference Paper
Language: English
Published: Technical Committee on Control Theory, Chinese Association of Automation, 28.07.2024
Subjects:
ISSN: 1934-1768
Online Access: Full text
Description
Abstract: During the execution of a robotic grasping task, the task may fail due to the close proximity of multiple objects if grasping is the only motion primitive. Non-prehensile manipulations, such as pushing, can be used to rearrange objects and benefit grasping. Varying pushing actions with different speeds, distances, and routines may result in better performance. In this study, we propose a vision perception-based Adaptive Pushing Assisted Grasping Network (APAGN) for generating a sequence of actions that includes grasping and adaptive pushing. APAGN perceives the scene and then predicts the locations of objects after an adaptive push, which adjusts the force and direction of pushing based on the expected performance. For more efficient computation, an Action Selector within APAGN chooses the object with the highest expected outcome before making a prediction. The value of pushing actions is estimated by how much they benefit grasping, which removes the dependence on manually designed rewards. Simulations show that APAGN can achieve higher action efficiency than baseline methods, especially in cluttered environments.
DOI: 10.23919/CCC63176.2024.10662371
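
The abstract describes pushes that are valued purely by how much they are expected to benefit subsequent grasping, with an Action Selector picking the most promising object before the prediction step. Below is a minimal, hypothetical sketch of that idea, not the authors' implementation: the names (PushCandidate, select_action, predict_post_push_scores, grasp_threshold), the threshold value, and the greedy selection rule are assumptions introduced here for illustration only.

# Hypothetical sketch (not the APAGN code): choose between grasping and an
# adaptive push, where a push is scored only by its predicted benefit to
# grasping. All models and names below are stand-ins.

from dataclasses import dataclass
from typing import Callable, List, Sequence, Tuple


@dataclass
class PushCandidate:
    object_id: int        # object the push is applied to
    direction_rad: float  # pushing direction in the image plane (assumed parameterization)
    force: float          # pushing force / distance surrogate (assumed parameterization)


def select_action(
    grasp_scores: Sequence[float],                 # per-object graspability, e.g. from a grasp network
    push_candidates: Sequence[PushCandidate],      # sampled adaptive pushes
    predict_post_push_scores: Callable[[PushCandidate], List[float]],  # assumed scene-prediction model
    grasp_threshold: float = 0.6,                  # assumed cut-off for a "good enough" grasp
) -> Tuple[str, int]:
    """Return ('grasp', object_id) or ('push', candidate_index)."""
    best_obj = max(range(len(grasp_scores)), key=lambda i: grasp_scores[i])
    if grasp_scores[best_obj] >= grasp_threshold:
        return "grasp", best_obj

    # Value each push by its expected benefit to grasping: the predicted
    # improvement of the best graspability after the scene is rearranged.
    def push_value(c: PushCandidate) -> float:
        predicted = predict_post_push_scores(c)
        return max(predicted) - grasp_scores[best_obj]

    best_push = max(range(len(push_candidates)), key=lambda k: push_value(push_candidates[k]))
    return "push", best_push


if __name__ == "__main__":
    # Toy usage with a fabricated predictor that says pushing object 1 helps most.
    scores = [0.35, 0.40, 0.30]
    pushes = [PushCandidate(0, 0.0, 1.0), PushCandidate(1, 1.57, 0.5)]
    fake_predictor = lambda c: [0.9, 0.5, 0.4] if c.object_id == 1 else [0.5, 0.45, 0.4]
    print(select_action(scores, pushes, fake_predictor))   # -> ('push', 1)

The point mirrored in this sketch is that no hand-crafted push reward appears anywhere: a push is valued solely by the predicted gain in graspability after the rearrangement, which is the property the abstract highlights.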