Superpixel-based color–depth restoration and dynamic environment modeling for Kinect-assisted image-based rendering systems

Published in: The Visual Computer, Volume 34, Issue 1, pp. 67–81
Main authors: Wang, Chong; Chan, Shing-Chow; Zhu, Zhen-Yu; Zhang, Li; Shum, Heung-Yeung
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg, 01.01.2018 (Springer Nature B.V.)
ISSN: 0178-2789, 1432-2315
Description
Summary: Depth information is an important ingredient in many multiview applications, including image-based rendering (IBR). With advances in electronics, low-cost and high-speed depth cameras such as the Microsoft Kinect are becoming increasingly popular. In this paper, we propose a superpixel-based joint color–depth restoration approach for the Kinect depth camera and study its application to view synthesis in IBR systems. First, an edge-based matching method is proposed to reduce color–depth registration errors. The Kinect depth map is then restored using probabilistic color–depth superpixels, probabilistic local polynomial regression, and joint color–depth matting. The proposed restoration algorithm not only inpaints the missing data but also corrects and refines the depth map to provide better color–depth consistency. Finally, a dynamic background modeling scheme is proposed to address the disocclusion problem in view synthesis for dynamic environments. Experimental results show the effectiveness of the proposed algorithm and system.
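To give a concrete sense of the per-superpixel polynomial-regression idea mentioned in the summary, the sketch below (Python/NumPy) illustrates only the general technique, not the paper's probabilistic formulation: it assumes a precomputed superpixel label map and fills depth holes (marked as 0) by fitting a first-order polynomial to the valid depth samples inside each superpixel. The function name fill_depth_per_superpixel and the min_samples threshold are illustrative choices, not taken from the paper.

```python
# Illustrative sketch (not the authors' algorithm): fill missing depth values
# by fitting a planar model d ~ a*x + b*y + c to the valid depth samples
# inside each superpixel, then predicting the holes from that fit.
import numpy as np

def fill_depth_per_superpixel(depth, labels, min_samples=8):
    """depth: HxW array with 0 marking missing values.
    labels: HxW integer superpixel label map (e.g. from a SLIC-style segmentation)."""
    out = depth.astype(np.float64).copy()
    for lab in np.unique(labels):
        ys, xs = np.nonzero(labels == lab)
        d = out[ys, xs]
        valid = d > 0
        if valid.sum() < min_samples:
            continue  # too few samples for a stable fit; leave the holes untouched
        # Design matrix [x, y, 1] for a first-order (planar) least-squares fit.
        A = np.stack([xs[valid], ys[valid], np.ones(valid.sum())], axis=1)
        coef, *_ = np.linalg.lstsq(A, d[valid], rcond=None)
        holes = ~valid
        if holes.any():
            Ah = np.stack([xs[holes], ys[holes], np.ones(holes.sum())], axis=1)
            out[ys[holes], xs[holes]] = Ah @ coef
    return out
```

In practice a color-based segmentation would supply the label map, and the filled depth would then be refined jointly with the color image, which is where the paper's probabilistic superpixels and joint color–depth matting come in.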
DOI: 10.1007/s00371-016-1312-2