A self-supervised method of single-image depth estimation by feeding forward information using max-pooling layers


Detailed bibliography
Published in: The Visual Computer, Volume 37, Issue 4, pp. 815–829
Main authors: Shi, Jinlong; Sun, Yunhan; Bai, Suqin; Sun, Zhengxing; Tian, Zhaohui
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg, 01.04.2021
Springer Nature B.V.
Subjects:
ISSN:0178-2789, 1432-2315
Online access: Get full text
Description
Summary: We propose an encoder–decoder CNN framework to predict depth from a single image in a self-supervised manner. To this end, we design three kinds of encoders based on recent advanced deep neural networks and one kind of decoder that can generate multiscale predictions. Eight loss functions are designed for the proposed encoder–decoder CNN framework to validate its performance. For training, we take rectified stereo image pairs as input to the CNN, which is trained by reconstructing images via learned multiscale disparity maps. For testing, the CNN estimates accurate depth information from only a single input image. We validate our framework on two public datasets against state-of-the-art methods and our own design variants, and we evaluate the performance of different encoder–decoder architectures and loss functions to obtain the best combination. The results show that our proposed method performs very well for single-image depth estimation without ground-truth supervision.
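The self-supervised training signal described above (reconstructing one stereo view from the other via a predicted disparity map) can be sketched in a few lines. This is a minimal illustration only, assuming a simple nearest-neighbor horizontal warp and an L1 photometric loss; the function names are hypothetical and not taken from the paper, which uses learned multiscale disparities inside a CNN.

```python
import numpy as np

def warp_with_disparity(right, disp):
    """Reconstruct the left view by sampling the right image at
    horizontally shifted coordinates given by the disparity map."""
    h, w = right.shape
    cols = np.arange(w)[None, :] - disp          # left(x) ~ right(x - d)
    cols = np.clip(np.round(cols).astype(int), 0, w - 1)
    rows = np.arange(h)[:, None]
    return right[rows, cols]

def photometric_loss(left, right, disp):
    """Mean absolute error between the left image and its reconstruction.
    Minimizing this over disp is the self-supervised training signal."""
    return np.abs(left - warp_with_disparity(right, disp)).mean()
```

In the paper's setting, `disp` is the CNN's output and the loss is backpropagated through a differentiable (bilinear) sampler rather than this nearest-neighbor lookup; at test time the network no longer needs the right image at all.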
DOI:10.1007/s00371-020-01832-6