A feature-extraction localization algorithm research for teaching-free automated robotic welding based on 3D point cloud

Bibliographic Details
Published in: The International Journal of Advanced Manufacturing Technology, Vol. 138, No. 11, pp. 5397-5412
Main Authors: Li, Yiheng; Xu, Yanling; Wang, Xinghua; Ma, Xiaoyang; Wang, Qiang; Zhang, Huajun
Format: Journal Article
Language: English
Published: London: Springer London (Springer Nature B.V.), 01.06.2025
ISSN: 0268-3768, 1433-3015
Description
Summary: The persistent challenge of discrepancies between virtual environments and actual working conditions in offline programming for robotic welding continues to impede practical implementation effectiveness. This research presents a feature-extraction localization technology for teaching-free automated robotic welding based on a 3D vision sensing system. Our approach employs 3D vision sensors to capture the actual workpiece’s point cloud, which is then registered with the standard 3D digital model or used to extract the mathematical model of welds. Subsequently, these registration or extraction outcomes are used to adjust the offline programming trajectory and obtain the precise trajectory on the workpiece. The core technological innovations include the 3D reconstruction of the workpiece, initial localization of point cloud registration through the FPFH-RANSAC-ICP algorithm, and welding seam localization based on point cloud segmentation and feature extraction. Experimental validation on T-pipe, triplanar fillet, and V-groove butt weld configurations demonstrates the algorithm’s efficiency and accuracy.
DOI: 10.1007/s00170-025-15797-0
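
The abstract names an FPFH-RANSAC-ICP pipeline for the initial localization step but gives no implementation details. The snippet below is a minimal sketch of such a coarse-to-fine registration between a scanned workpiece cloud and a CAD-derived reference cloud, written with the open-source Open3D library; the file names, voxel size, and distance thresholds are illustrative assumptions, not values from the paper, and the authors' own implementation may differ.

```python
# Coarse-to-fine point cloud registration sketch: FPFH features + RANSAC
# global alignment, refined by point-to-plane ICP (assumed Open3D usage).
import open3d as o3d

VOXEL = 2.0          # downsampling voxel size, e.g. mm (assumed)
DIST_COARSE = 3.0    # RANSAC max correspondence distance (assumed)
DIST_FINE = 1.0      # ICP max correspondence distance (assumed)


def preprocess(pcd, voxel):
    """Downsample, estimate normals, and compute FPFH descriptors."""
    down = pcd.voxel_down_sample(voxel)
    down.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 2, max_nn=30))
    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        down, o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 5, max_nn=100))
    return down, fpfh


# Placeholder paths: scanned workpiece and a cloud sampled from the CAD model.
scan = o3d.io.read_point_cloud("workpiece_scan.pcd")
model = o3d.io.read_point_cloud("cad_model_sampled.pcd")

scan_d, scan_f = preprocess(scan, VOXEL)
model_d, model_f = preprocess(model, VOXEL)

# Coarse alignment: RANSAC over FPFH feature correspondences.
coarse = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
    scan_d, model_d, scan_f, model_f, True, DIST_COARSE,
    o3d.pipelines.registration.TransformationEstimationPointToPoint(False), 3,
    [o3d.pipelines.registration.CorrespondenceCheckerBasedOnEdgeLength(0.9),
     o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(DIST_COARSE)],
    o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))

# Fine alignment: point-to-plane ICP seeded with the RANSAC transform.
fine = o3d.pipelines.registration.registration_icp(
    scan_d, model_d, DIST_FINE, coarse.transformation,
    o3d.pipelines.registration.TransformationEstimationPointToPlane())

print("Estimated scan-to-model transform:\n", fine.transformation)
```

The resulting transform could then be applied to the offline-programmed trajectory so that it matches the scanned workpiece pose; the paper's subsequent weld seam localization by segmentation and feature extraction is not covered by this sketch.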