Gradient-Enhancing Conversion for Illumination-Robust Lane Detection
| Published in: | IEEE Transactions on Intelligent Transportation Systems, Vol. 14, No. 3, pp. 1083-1094 |
|---|---|
| Main authors: | , , |
| Format: | Journal Article |
| Language: | English |
| Published: | IEEE, 01.09.2013 |
| Subjects: | |
| ISSN: | 1524-9050, 1558-0016 |
| Online access: | Get full text |
| Summary: | Lane detection is important in many advanced driver-assistance systems (ADAS). Vision-based lane detection algorithms are widely used and generally use gradient information as a lane feature. However, gradient values between lanes and roads vary with illumination change, which degrades the performance of lane detection systems. In this paper, we propose a gradient-enhancing conversion method for illumination-robust lane detection. Our proposed gradient-enhancing conversion method produces a new gray-level image from an RGB color image based on linear discriminant analysis. The converted images have large gradients at lane boundaries. To deal with illumination changes, the gray-level conversion vector is dynamically updated. In addition, we propose a novel lane detection algorithm, which uses the proposed conversion method, adaptive Canny edge detector, Hough transform, and curve model fitting method. We performed several experiments in various illumination environments and confirmed that the gradient is maximized at lane boundaries on the road. The detection rate of the proposed lane detection algorithm averages 96% and is greater than 93% in very poor environments. |
| DOI: | 10.1109/TITS.2013.2252427 |
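The core idea in the abstract (projecting RGB pixels onto a discriminant vector so that lane/road contrast in the resulting gray image is maximized) can be sketched with standard two-class Fisher LDA. This is only an illustrative approximation, not the paper's exact formulation: the helper names, the synthetic lane/road color samples, and the min-max rescaling step are all assumptions for demonstration.

```python
import numpy as np

def lda_conversion_vector(lane_rgb, road_rgb):
    """Fisher-LDA weight vector separating lane pixel colors from road pixel
    colors. Hypothetical helper, sketching the paper's idea, not its exact math.

    lane_rgb, road_rgb: (N, 3) arrays of RGB samples for each class.
    """
    m_lane = lane_rgb.mean(axis=0)
    m_road = road_rgb.mean(axis=0)
    # Within-class scatter approximated as the sum of the class covariances.
    sw = np.cov(lane_rgb, rowvar=False) + np.cov(road_rgb, rowvar=False)
    # Classic Fisher solution: w ∝ Sw^{-1} (m1 - m2).
    w = np.linalg.solve(sw, m_lane - m_road)
    return w / np.linalg.norm(w)

def to_gray(rgb_image, w):
    """Project each RGB pixel onto w, then rescale to 0..255 so the result
    can feed an edge detector (e.g. Canny) as an ordinary gray image."""
    gray = rgb_image.astype(float) @ w   # (H, W, 3) @ (3,) -> (H, W)
    gray -= gray.min()
    if gray.max() > 0:
        gray = gray / gray.max() * 255.0
    return gray.astype(np.uint8)

# Synthetic example: whitish lane markings on a gray road surface.
rng = np.random.default_rng(0)
lane_samples = rng.normal([200, 200, 180], 10, size=(500, 3))
road_samples = rng.normal([90, 90, 95], 10, size=(500, 3))
w = lda_conversion_vector(lane_samples, road_samples)

img = rng.normal([90, 90, 95], 10, size=(40, 40, 3))          # road background
img[:, 18:22, :] = rng.normal([200, 200, 180], 10, size=(40, 4, 3))  # lane stripe
gray = to_gray(img, w)  # lane columns come out much brighter than road columns
```

In the paper the conversion vector is additionally updated over time to track illumination changes; in this sketch that would amount to re-estimating `lane_samples`/`road_samples` from recent frames and recomputing `w`.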