3D Lane Detection With Attention in Attention

Bibliographic details
Published in: IEEE Signal Processing Letters, Volume 31, pp. 1104–1108
Main authors: Gu, Yinchao; Ma, Chao; Li, Qian; Yang, Xiaokang
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2024
ISSN: 1070-9908, 1558-2361
DOI: 10.1109/LSP.2024.3385334

Description
Summary: Lane detection is a critical component of autonomous driving systems and has been the subject of extensive research. Unlike object detection, identifying car lanes requires extracting features from multi-scale information, since lanes are slender, sparse, and distributed across the entire image. Unfortunately, many previous works have overlooked this requirement, resulting in less robust lane feature extraction. To address this issue, we propose an attention mechanism that is better suited than CNNs to extracting global, long-range lane features. Our attention-based structure, called attention in attention, explores the relationship between various correlations and mitigates the mismatched-correlation problem in attention computation. Furthermore, we introduce a novel feature fusion structure in the backbone, called the double feature pyramid network, which effectively gathers feature information across various dimensions and enlarges the receptive field. Our network is built on BEV-LaneDet and achieves impressive performance on the OpenLane dataset. Notably, experiments demonstrate that our method surpasses BEV-LaneDet by 4.4% in F-Score on OpenLane.
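
Since this record carries only the abstract, the sketch below shows one plausible reading of an "attention in attention" block in PyTorch: an outer scaled dot-product attention computes a token-to-token correlation map, and an inner, lightweight gate re-weights that map so that unreliable (mismatched) correlations are damped toward an identity mapping. The class name AttentionInAttention, the gating formulation, and all tensor shapes are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch only: one way an "attention in attention" block could
# re-weight an attention map to suppress mismatched correlations.
# Not the paper's code; all names and design choices are assumptions.
import torch
import torch.nn as nn

class AttentionInAttention(nn.Module):
    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        assert dim % num_heads == 0, "dim must be divisible by num_heads"
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.qkv = nn.Linear(dim, dim * 3, bias=False)
        # Inner ("attention over attention") gate: scores, per query, how
        # trustworthy that query's row of correlations is.
        self.inner_gate = nn.Linear(self.head_dim, 1)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim), e.g. flattened BEV feature-map cells
        b, n, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Reshape each to (batch, heads, tokens, head_dim)
        q, k, v = (t.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
                   for t in (q, k, v))

        # Outer attention: standard scaled dot-product correlation map.
        attn = (q @ k.transpose(-2, -1)) / self.head_dim ** 0.5  # (b, h, n, n)
        attn = attn.softmax(dim=-1)

        # Inner gate in [0, 1]: blend each row of the correlation map with an
        # identity row, damping rows judged unreliable. Rows still sum to 1.
        gate = torch.sigmoid(self.inner_gate(q))                 # (b, h, n, 1)
        attn = gate * attn + (1.0 - gate) * torch.eye(n, device=x.device)

        out = (attn @ v).transpose(1, 2).reshape(b, n, d)
        return self.proj(out)

# Toy usage: 2 images, a 14x14 grid flattened to 196 tokens of width 64.
x = torch.randn(2, 196, 64)
y = AttentionInAttention(dim=64)(x)   # -> shape (2, 196, 64)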