Encompass obstacle image detection method based on U-V disparity map and RANSAC algorithm

Detailed bibliography
Published in: Scientific Reports, Vol. 15, Issue 1, p. 6164 - 18
Main author: Xu, Huiqiong
Format: Journal Article
Language: English
Publication details: London: Nature Publishing Group UK, 20.02.2025
ISSN: 2045-2322
Description
Summary: With the rapid development of autonomous driving technology, obstacle image detection has become a key problem that autonomous vehicles must solve, and its accuracy directly affects vehicle safety and reliability. Existing detection methods are often sensitive to lighting and weather conditions. To address these problems, this study combines U-V disparity maps for obstacle detection. The maps are used to coarsely filter out non-road disparities and, based on the projection information, to recover the disparity coordinates and related attributes of each line segment in the disparity map. A random sample consensus (RANSAC) algorithm is then applied to fit the road line and remove noise, and a new obstacle image detection method is designed on this basis. The results showed a classification loss of 0.013, a generalized intersection over union (GIoU) loss of 0.0072, a target loss that converged to 0.0026, and an algorithm accuracy of over 95%. These findings offer novel insights into the advancement of obstacle image detection technology, with potential applications in autonomous driving and image recognition.
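
The abstract only summarizes the pipeline, so a hedged illustration may help. The following Python sketch is an assumption-laden reconstruction, not the paper's code: it builds U- and V-disparity maps from a dense disparity image, fits the road line in the V-disparity map with a RANSAC-style sampler, and marks pixels lying well above the fitted road profile as obstacle candidates. Array layouts, vote thresholds, and the inlier tolerance are illustrative choices, not values from the article.

```python
# Minimal sketch, assuming a dense disparity image as a 2-D NumPy array.
# It is not the authors' implementation; thresholds are illustrative.
import numpy as np


def uv_disparity_maps(disparity, max_disp):
    """Accumulate column-wise (U) and row-wise (V) disparity histograms."""
    h, w = disparity.shape
    d = np.clip(disparity.astype(int), 0, max_disp - 1)
    u_map = np.zeros((max_disp, w), dtype=np.int32)  # disparity x image column
    v_map = np.zeros((h, max_disp), dtype=np.int32)  # image row x disparity
    rows, cols = np.nonzero(disparity > 0)           # ignore invalid (zero) disparities
    np.add.at(u_map, (d[rows, cols], cols), 1)
    np.add.at(v_map, (rows, d[rows, cols]), 1)
    return u_map, v_map


def ransac_road_line(v_map, min_votes=20, iters=500, tol=1.5, rng=None):
    """Fit the road line row = a * disparity + b to strong V-disparity cells."""
    rng = np.random.default_rng() if rng is None else rng
    rows, disps = np.nonzero(v_map >= min_votes)     # coarse filter: keep well-supported cells
    pts = np.stack([disps, rows], axis=1).astype(float)
    best_inliers, best_count = None, -1
    for _ in range(iters):
        i, j = rng.choice(len(pts), size=2, replace=False)
        (d1, r1), (d2, r2) = pts[i], pts[j]
        if d1 == d2:                                 # degenerate sample, skip
            continue
        a = (r2 - r1) / (d2 - d1)
        b = r1 - a * d1
        inliers = np.abs(pts[:, 1] - (a * pts[:, 0] + b)) < tol
        if inliers.sum() > best_count:
            best_count, best_inliers = inliers.sum(), inliers
    # refine the line with least squares over the best inlier set
    a, b = np.polyfit(pts[best_inliers, 0], pts[best_inliers, 1], 1)
    return a, b


def obstacle_mask(disparity, a, b, margin=3.0):
    """Mark pixels whose disparity exceeds the expected road disparity for their row."""
    h, _ = disparity.shape
    road_disp = (np.arange(h)[:, None] - b) / a      # invert row = a * d + b per image row
    return (disparity > 0) & (disparity > road_disp + margin)
```

In this reading, the coarse filtering step from the abstract corresponds to discarding weakly supported V-disparity cells before line fitting, and the least-squares refit over the inlier set plays the noise-removal role attributed to RANSAC.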
DOI:10.1038/s41598-025-89785-5