Integration of Artificial Vision and Image Processing into a Pick and Place Collaborative Robotic System


Bibliographic Details
Published in: Journal of Intelligent & Robotic Systems, Vol. 110, No. 4, p. 159
Main Authors: Santos, Adriano A., Schreurs, Cas, da Silva, António Ferreira, Pereira, Filipe, Felgueiras, Carlos, Lopes, António M., Machado, José
Format: Journal Article
Language: English
Published: Dordrecht: Springer Netherlands (Springer Nature B.V.), 08.11.2024
Subjects:
ISSN: 0921-0296, 1573-0409
Description
Summary: In the field of robotics, pick and place applications are becoming increasingly popular due to their ability to automate repetitive tasks that can cause temporary or permanent injuries. To enhance the efficiency of these applications, object recognition using a fixed camera or one mounted on a robotic hand has been employed. This paper explores the possibilities of implementing a low-cost camera into a collaborative robotic system. A software architecture has been developed, including modules for perception, pick and place, and part transfer. A comprehensive overview of various intuitive drag-and-drop image processing technologies and their suitability for object recognition in a robotic context is provided. The challenges related to lighting and the effect of shadows on object recognition are discussed. A critical assessment is made of the architecture development platform as well as of the study and its results, and the effectiveness of the proposed solution, based on the Niop architecture, is verified.
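The three-module architecture described in the summary (perception, pick and place, and part transfer) can be sketched as a minimal pipeline. The class and method names below are illustrative assumptions, not the paper's actual API; the camera and robot interfaces are stubbed so the sketch stays self-contained.

```python
# Hypothetical sketch of the modular pipeline described in the abstract:
# perception -> pick and place -> part transfer. Names are illustrative.
from dataclasses import dataclass


@dataclass
class DetectedPart:
    label: str
    x: float  # position in the camera frame (illustrative units)
    y: float


class PerceptionModule:
    """Recognizes parts from a fixed or hand-mounted camera (stubbed here)."""

    def detect(self, frame):
        # A real implementation would run image processing on `frame`;
        # a fixed result is returned here to keep the sketch runnable.
        return [DetectedPart(label="part_A", x=0.12, y=0.34)]


class PickPlaceModule:
    """Moves the robot arm to a detected part and grips it (stubbed)."""

    def pick(self, part: DetectedPart) -> bool:
        print(f"Picking {part.label} at ({part.x:.2f}, {part.y:.2f})")
        return True


class TransferModule:
    """Transfers a gripped part to its target location (stubbed)."""

    def transfer(self, part: DetectedPart) -> bool:
        print(f"Transferring {part.label} to drop-off zone")
        return True


def run_cycle(frame) -> None:
    """One pick-and-place cycle through the three modules."""
    perception, picker, mover = PerceptionModule(), PickPlaceModule(), TransferModule()
    for part in perception.detect(frame):
        if picker.pick(part):
            mover.transfer(part)


run_cycle(frame=None)  # camera input stubbed out in this sketch
```

Keeping the modules behind narrow interfaces like these is what lets a low-cost camera or a different recognition backend be swapped into the perception stage without touching the motion code.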
DOI:10.1007/s10846-024-02195-z