Real-Time Intensity-Image Reconstruction for Event Cameras Using Manifold Regularisation

Bibliographic Details
Published in: International Journal of Computer Vision, Volume 126, Issue 12, pp. 1381–1393
Main Authors: Munda, Gottfried; Reinbacher, Christian; Pock, Thomas
Format: Journal Article
Language: English
Published: New York: Springer US, 01.12.2018
ISSN: 0920-5691, 1573-1405
Description
Summary: Event cameras, or neuromorphic cameras, mimic the human perception system in that they measure per-pixel intensity changes rather than the actual intensity level. In contrast to traditional cameras, such cameras capture new information about the scene at MHz frequency in the form of sparse events. The high temporal resolution comes at the cost of losing the familiar per-pixel intensity information. In this work we propose a variational model that accurately models the behaviour of event cameras, enabling reconstruction of intensity images at arbitrary frame rates in real time. Our method is formulated on a per-event basis, where we explicitly incorporate information about the asynchronous nature of events via an event manifold induced by the relative timestamps of events. In our experiments we verify that solving the variational model on the manifold produces high-quality images without explicitly estimating optical flow. This paper is an extended version of our previous work (Reinbacher et al. in British Machine Vision Conference (BMVC), 2016) and contains additional details of the variational model, an investigation of different data terms, and a quantitative evaluation of our method against competing methods as well as synthetic ground-truth data.
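The method itself is given in the paper; as a loose illustration of the ingredients named in the summary (per-event integration of intensity changes, and smoothness regularisation modulated by the timestamp surface that induces the event manifold), the sketch below builds a toy reconstruction in NumPy. The random event batch, the contrast threshold C, the weighting scheme, and the plain gradient-descent solver are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Toy sketch (NOT the authors' method): reconstruct a log-intensity image
# from a batch of events by minimising a regularised energy. The "event
# manifold" idea is mimicked by weighting the smoothness term with the
# surface of most recent timestamps, so that smoothing is weaker across
# regions of recent event activity (likely true intensity edges).

H, W = 64, 64
C = 0.15            # assumed contrast threshold of the sensor (illustrative)

# Hypothetical event batch: columns (x, y, timestamp, polarity in {-1, +1}).
rng = np.random.default_rng(0)
n = 500
events = np.stack([rng.integers(0, W, n),
                   rng.integers(0, H, n),
                   np.sort(rng.random(n)),
                   rng.choice([-1.0, 1.0], n)], axis=1)

# Direct event integration: each event changes log-intensity by +/- C.
u_data = np.zeros((H, W))
t_surf = np.zeros((H, W))           # most recent timestamp per pixel
for x, y, t, p in events:
    u_data[int(y), int(x)] += p * C
    t_surf[int(y), int(x)] = t

# Manifold-inspired weights: down-weight smoothing where the timestamp
# surface varies strongly.
gy, gx = np.gradient(t_surf)
w = 1.0 / (1.0 + 50.0 * (gx**2 + gy**2))

# Gradient descent on E(u) = |u - u_data|^2 + lam * w * |grad u|^2.
u, lam, step = u_data.copy(), 2.0, 0.1
for _ in range(200):
    uy, ux = np.gradient(u)
    div = np.gradient(w * ux, axis=1) + np.gradient(w * uy, axis=0)
    u -= step * (2.0 * (u - u_data) - 2.0 * lam * div)

print("reconstructed log-intensity range:", u.min(), u.max())
```

The paper reports real-time reconstruction; the quadratic energy and fixed-step gradient descent above are simplifications chosen for readability, not the solver used in the published work.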
DOI: 10.1007/s11263-018-1106-2