DELTA: Dense Depth from Events and LiDAR Using Transformer's Attention

Saved in:
Detailed Bibliography
Title: DELTA: Dense Depth from Events and LiDAR Using Transformer's Attention
Authors: Brebion, Vincent; Moreau, Julien; Davoine, Franck
Contributors: Brebion, Vincent
Source: 2025 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW). :4898-4907
Publication Status: Preprint
Publisher Information: IEEE, 2025.
Year of Publication: 2025
Subjects: FOS: Computer and information sciences, [INFO.INFO-CV] Computer Science [cs]/Computer Vision and Pattern Recognition [cs.CV], Computer Vision and Pattern Recognition (cs.CV), Computer Science - Computer Vision and Pattern Recognition, I.4.8, [INFO.INFO-LG] Computer Science [cs]/Machine Learning [cs.LG]
Description: Event cameras and LiDARs provide complementary yet distinct data: respectively, asynchronous detections of changes in lighting versus sparse but accurate depth information at a fixed rate. To this day, few works have explored the combination of these two modalities. In this article, we propose a novel neural-network-based method for fusing event and LiDAR data in order to estimate dense depth maps. Our architecture, DELTA, exploits the concepts of self- and cross-attention to model the spatial and temporal relations within and between the event and LiDAR data. Following a thorough evaluation, we demonstrate that DELTA sets a new state of the art in the event-based depth estimation problem, and that it is able to reduce the errors up to four times for close ranges compared to the previous SOTA.
Accepted for the CVPR 2025 Workshop on Event-based Vision. For the project page, see https://vbrebion.github.io/DELTA/
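The cross-attention fusion mentioned in the abstract (event features attending to LiDAR features) can be sketched minimally as follows. This is an illustrative single-head attention sketch only, not the actual DELTA architecture; the token counts, feature dimension, and variable names (`event_tokens`, `lidar_tokens`) are assumptions for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys_values, wq, wk, wv):
    """Single-head cross-attention: one modality's tokens attend to another's."""
    q = queries @ wq          # (N_query, d)
    k = keys_values @ wk      # (N_kv, d)
    v = keys_values @ wv      # (N_kv, d)
    scores = q @ k.T / np.sqrt(q.shape[-1])   # scaled dot-product scores
    return softmax(scores, axis=-1) @ v       # (N_query, d)

rng = np.random.default_rng(0)
d = 16
event_tokens = rng.standard_normal((32, d))   # hypothetical event-stream features
lidar_tokens = rng.standard_normal((8, d))    # hypothetical sparse LiDAR features
wq, wk, wv = (rng.standard_normal((d, d)) for _ in range(3))

# Event tokens query the LiDAR tokens, producing depth-informed event features.
fused = cross_attention(event_tokens, lidar_tokens, wq, wk, wv)
print(fused.shape)  # prints (32, 16)
```

In a transformer-style fusion network, such cross-attention blocks are typically interleaved with per-modality self-attention and feed-forward layers; the sketch above shows only the core attention operation.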
Document Type: Article
Conference object
File Description: application/pdf
DOI: 10.1109/cvprw67362.2025.00482
DOI: 10.48550/arxiv.2505.02593
Access URL: http://arxiv.org/abs/2505.02593
https://hal.science/hal-05057148v1/document
https://hal.science/hal-05057148v1
Rights: STM Policy #29
CC BY
Accession Number: edsair.doi.dedup.....fe92df14d688ca0dd17bc8007de5e8c9
Database: OpenAIRE