Pavement distress detection using convolutional neural networks with images captured via UAV


Bibliographic Details
Published in: Automation in Construction, Vol. 133, Art. 103991
Authors: Zhu, Junqing; Zhong, Jingtao; Ma, Tao; Huang, Xiaoming; Zhang, Weiguang; Zhou, Yang
Format: Journal Article
Language: English
Published: Amsterdam: Elsevier B.V., 01.01.2022
ISSN: 0926-5805, 1872-7891
Online access: Full text
Description
Abstract: Pavement distress detection is crucial in decision-making for maintenance planning. Unmanned aerial vehicles (UAVs) are helpful in collecting pavement images. This paper proposes the collection of pavement distress information using a UAV with a high-resolution camera. A UAV platform for pavement image collection was assembled, and the flight settings were studied for optimal image quality. The collected images were processed and annotated for model training. Three state-of-the-art object-detection algorithms (Faster R-CNN, YOLOv3, and YOLOv4) were trained on the dataset, and their prediction performances were compared. A pavement image dataset was established with six types of distress. YOLOv3 demonstrated the best performance of the three algorithms, with a mean average precision (mAP) of 56.6%. The findings of this study support non-destructive, automatic pavement condition inspection.

• UAV flight parameters are examined for pavement image collection.
• A UAV pavement image dataset (UAPD) was established.
• Anchor size is researched for pavement distress detection.
• YOLOv3 outperforms YOLOv4 and Faster R-CNN.
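The abstract describes fine-tuning pretrained object detectors on an annotated pavement-distress dataset. As a rough illustration of that kind of setup (not the authors' actual code or configuration), the sketch below fine-tunes a torchvision Faster R-CNN, one of the three compared models, for six distress classes plus background. The class count, weights choice, and training loop are illustrative assumptions.

```python
# Hypothetical sketch: fine-tuning Faster R-CNN for six pavement-distress
# classes. Paths, hyperparameters, and the data loader are assumed, not
# taken from the paper.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 1 + 6  # background + six distress types from the dataset

def build_model():
    # Start from COCO-pretrained weights and swap in a new box-predictor
    # head sized for the pavement-distress classes.
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)
    return model

def train_one_epoch(model, loader, optimizer, device):
    model.train()
    for images, targets in loader:  # targets: dicts with "boxes" and "labels"
        images = [img.to(device) for img in images]
        targets = [{k: v.to(device) for k, v in t.items()} for t in targets]
        loss_dict = model(images, targets)  # train mode returns a loss dict
        loss = sum(loss_dict.values())
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

Note that torchvision's detection models return a dictionary of losses (rather than predictions) when called in training mode, which is why the loop sums `loss_dict.values()` before backpropagating.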
DOI: 10.1016/j.autcon.2021.103991