3D Lane Detection With Attention in Attention

Bibliographic Details
Published in: IEEE Signal Processing Letters, Vol. 31, pp. 1104-1108
Main Authors: Gu, Yinchao, Ma, Chao, Li, Qian, Yang, Xiaokang
Format: Journal Article
Language:English
Published: New York: IEEE, 2024
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
ISSN:1070-9908, 1558-2361
Description
Summary: Lane detection is a critical component of autonomous driving systems and has been the subject of extensive research. Unlike object detection, lane detection requires extracting features from multi-scale information, since lanes are slender, sparse, and distributed across the entire image. Unfortunately, many previous works have overlooked this requirement, resulting in less robust lane feature extraction. To address this issue, we propose an attention mechanism that excels at extracting global, long-range lane features compared to CNNs. Our attention-based structure, called attention in attention, explores the relationship between various correlations and mitigates mismatched-correlation problems in attention computation. Furthermore, we introduce a novel feature-fusion structure in the backbone, called the double feature pyramid network, which effectively gathers feature information of various dimensions and enlarges the receptive field. Our network is based on BEV-LaneDet and achieves impressive performance on the OpenLane dataset. Notably, experiments demonstrate that our method surpasses BEV-LaneDet by 4.4% in F-Score on OpenLane.
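The global, long-range feature extraction that the summary attributes to attention (in contrast to the local receptive fields of CNNs) can be illustrated with a minimal sketch. Note this is generic scaled dot-product self-attention, not the authors' attention-in-attention module or their double feature pyramid network; all function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, wq, wk, wv):
    """Scaled dot-product self-attention over a sequence of feature vectors.

    Every position attends to every other position, so the output at each
    location mixes information from the whole input -- the global receptive
    field that suits slender lanes spanning the entire image.
    """
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])   # pairwise correlation scores
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ v

rng = np.random.default_rng(0)
n, d = 6, 8                                   # 6 positions, 8-dim features
x = rng.standard_normal((n, d))
wq, wk, wv = (rng.standard_normal((d, d)) for _ in range(3))
out = self_attention(x, wq, wk, wv)
print(out.shape)                              # (6, 8): same shape, globally mixed
```

A CNN layer with a 3x3 kernel would mix each position only with its immediate neighbors; here every output row is a weighted combination of all input positions in a single layer, which is the property the summary leverages for sparse, image-wide lane features.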
DOI:10.1109/LSP.2024.3385334