Real-Time and Fully Automated Robotic Stacking System with Deep Learning-Based Visual Perception

Bibliographic Details
Title: Real-Time and Fully Automated Robotic Stacking System with Deep Learning-Based Visual Perception
Authors: Ali Sait Ozer, Ilkay Cinar
Source: Sensors, Vol 25, Iss 22, p 6960 (2025)
Publisher Information: MDPI AG, 2025.
Publication Year: 2025
Collection: LCC: Chemical technology
Subjects: computer vision, industrial automation, programmable logic controller integration, real-time object detection, robotic stacking, smart manufacturing, Chemical technology, TP1-1185
Description: This study presents a fully automated, real-time robotic stacking system based on deep learning-driven visual perception, designed to optimize classification and handling tasks on industrial production lines. The proposed system integrates a YOLOv5s-based object detection algorithm with an ABB IRB6640 robotic arm via a programmable logic controller and the Profinet communication protocol. Using a camera mounted above a conveyor belt and a Python-based interface, 13 different types of industrial bags were classified and sorted. The trained model achieved a high validation performance with an mAP@0.5 score of 0.99 and demonstrated 99.08% classification accuracy in initial field tests. Following environmental and mechanical optimizations, such as adjustments to lighting, camera angle, and cylinder alignment, the system reached 100% operational accuracy during real-world applications involving 9600 packages over five days. With an average cycle time of 10–11 s, the system supports a processing capacity of up to six items per minute, exhibiting robustness, adaptability, and real-time performance. This integration of computer vision, robotics, and industrial automation offers a scalable solution for future smart manufacturing applications.
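The abstract outlines a perception-to-control pipeline: a camera over a conveyor feeds frames to YOLOv5s inference in Python, and the detected class is forwarded to a PLC so the robot can stack the item. As a rough illustration only, the minimal sketch below shows what such a loop could look like. The model path, camera index, confidence threshold, and the send_to_plc() bridge are hypothetical placeholders; the record does not specify the authors' actual implementation or Profinet interface.

```python
# Minimal sketch of a YOLOv5-based conveyor detection loop, assuming a model
# trained on the paper's 13 bag classes and exported for the Ultralytics
# torch.hub API. All names and values here are illustrative, not the authors'.
import cv2
import torch

# Load a custom-trained YOLOv5 model ("best.pt" is a placeholder path).
model = torch.hub.load("ultralytics/yolov5", "custom", path="best.pt")
model.conf = 0.5  # confidence threshold (assumed value)

def send_to_plc(class_id: int) -> None:
    """Hypothetical stand-in for the PLC/Profinet bridge used in the paper."""
    print(f"PLC <- class {class_id}")

cap = cv2.VideoCapture(0)  # camera mounted above the conveyor belt (assumed index)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    results = model(frame[..., ::-1])  # OpenCV gives BGR; YOLOv5 expects RGB
    det = results.xyxy[0]              # detections: (x1, y1, x2, y2, conf, class)
    if len(det):
        best = det[det[:, 4].argmax()]  # keep the highest-confidence detection
        send_to_plc(int(best[5]))       # forward the class id for stacking
cap.release()
```

With a cycle time of 10–11 s per package, as reported in the abstract, such a loop would be far from the throughput bottleneck; the robot motion, not the inference, paces the line at roughly six items per minute (60 s / 10 s per item).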
Publication Type: article
File Description: electronic resource
Language: English
ISSN: 1424-8220
Relation: https://www.mdpi.com/1424-8220/25/22/6960; https://doaj.org/toc/1424-8220
DOI: 10.3390/s25226960
Access URL: https://doaj.org/article/64b45307582b4bdbbcb4a72215c32b11
Document Code: edsdoj.64b45307582b4bdbbcb4a72215c32b11
Database: Directory of Open Access Journals