Laser Curve Extraction of Wheelset Based on Deep Learning Skeleton Extraction Network

Detailed Description

Bibliographic Details
Published in: Sensors, Vol. 22, Issue 3, p. 859
Main authors: Luo, Shuai; Yang, Kai; Yang, Lijuan; Wang, Yong; Gao, Xiaorong; Jiang, Tianci; Li, Chunjiang
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 23 January 2022
ISSN: 1424-8220
Online access: Full text
Description
Abstract: In this paper, a new algorithm for extracting the laser stripe center is proposed. Based on a deep learning skeleton extraction network, the laser stripe center can be extracted quickly and accurately. Skeleton extraction is the process of reducing a shape image to its approximate central axis representation while preserving the image's topological and geometric structure, and it is an important step in topological and geometric shape analysis. According to the characteristics of the wheelset laser curve dataset, a new skeleton extraction network, a hierarchical skeleton network (LuoNet), is proposed. The proposed architecture has three levels of encoder–decoder networks, with a YE Module interconnection designed between each level of the encoder and decoder. On the wheelset laser curve dataset, the F1 score reaches 0.714. Compared with traditional laser curve center extraction algorithms, the proposed LuoNet algorithm offers shorter running time, higher accuracy, and more stable extraction results.
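The abstract reports an F1 score of 0.714 for the extracted skeletons. The paper itself does not spell out the computation here, but a common way to score a predicted skeleton mask against a ground-truth centerline is pixel-wise precision/recall; the sketch below (with a made-up toy mask, not the paper's data) illustrates that metric:

```python
import numpy as np

def f1_score_mask(pred, gt):
    """Pixel-wise F1 score between a predicted binary skeleton mask
    and a ground-truth centerline mask (illustrative, not the paper's code)."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    tp = np.logical_and(pred, gt).sum()      # skeleton pixels found
    fp = np.logical_and(pred, ~gt).sum()     # spurious pixels
    fn = np.logical_and(~pred, gt).sum()     # missed pixels
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Toy example: 1-pixel-wide centerline on a 5x5 grid.
gt = np.zeros((5, 5), dtype=np.uint8)
gt[2, :] = 1                 # true centerline along row 2
pred = np.zeros((5, 5), dtype=np.uint8)
pred[2, :4] = 1              # prediction covers four of five pixels...
pred[1, 4] = 1               # ...and drifts one pixel off at the end
print(round(f1_score_mask(pred, gt), 3))  # → 0.8
```

In practice, skeleton benchmarks often also allow a small distance tolerance when matching predicted pixels to the ground-truth centerline; the strict pixel-wise version above is the simplest variant.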
DOI: 10.3390/s22030859