PotNet: Pothole detection for autonomous vehicle system using convolutional neural network

Bibliographic Details
Published in: Electronics Letters Vol. 57; no. 2; pp. 53-56
Main Authors: Dewangan, Deepak Kumar, Sahu, Satya Prakash
Format: Journal Article
Language:English
Published: Stevenage: John Wiley & Sons, Inc., 01.01.2021
Wiley
Subjects:
ISSN:0013-5194, 1350-911X
Description
Summary: Advancement in vision‐based techniques has enabled the autonomous vehicle system (AVS) to understand the driving scene in depth. The capability of an AVS to understand the scene and detect specific objects depends on a strong feature representation of those objects. However, potholes are difficult to identify owing to their non‐uniform structure in challenging, dynamic road environments. Existing approaches have shown limited performance for the precise detection of potholes, and the link between pothole detection and the intelligent driving behaviour of an AVS is little explored in existing articles. Hence, an improved prototype model is proposed that not only detects potholes but also exhibits intelligent driving behaviour when a pothole is detected. The prototype is developed using a convolutional neural network with a vision camera to explore and validate the potential and autonomy of its driving behaviour in a prepared road environment. Experimental analysis of the proposed model on various performance measures yields accuracy, sensitivity, and F‐measure of 99.02%, 99.03%, and 98.33%, respectively, which are comparable with available state‐of‐the‐art techniques.
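The abstract reports accuracy, sensitivity, and F‐measure for the detector. As a minimal sketch of how such figures are derived from confusion‐matrix counts (the counts below are illustrative placeholders, not values from the paper):

```python
def detection_metrics(tp, tn, fp, fn):
    """Compute (accuracy, sensitivity, F-measure) from confusion-matrix counts.

    tp/tn/fp/fn: true/false positive and negative counts for the
    pothole class. Sensitivity is recall; F-measure is the harmonic
    mean of precision and recall.
    """
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)              # recall
    precision = tp / (tp + fp)
    f_measure = 2 * precision * sensitivity / (precision + sensitivity)
    return accuracy, sensitivity, f_measure

# Illustrative counts only (hypothetical, not from the paper):
acc, sens, f1 = detection_metrics(tp=980, tn=15, fp=10, fn=5)
print(f"accuracy={acc:.4f} sensitivity={sens:.4f} F={f1:.4f}")
```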
DOI:10.1049/ell2.12062