Superpixel-based color–depth restoration and dynamic environment modeling for Kinect-assisted image-based rendering systems

Bibliographic Details
Published in: The Visual Computer, Vol. 34, No. 1, pp. 67–81
Main Authors: Wang, Chong; Chan, Shing-Chow; Zhu, Zhen-Yu; Zhang, Li; Shum, Heung-Yeung
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01.01.2018
ISSN: 0178-2789, 1432-2315
Description
Summary: Depth information is an important ingredient in many multiview applications, including image-based rendering (IBR). With advances in electronics, low-cost and high-speed depth cameras such as the Microsoft Kinect are becoming increasingly popular. In this paper, we propose a superpixel-based joint color–depth restoration approach for the Kinect depth camera and study its application to view synthesis in IBR systems. First, an edge-based matching method is proposed to reduce color–depth registration errors. The Kinect depth map is then restored using probabilistic color–depth superpixels, probabilistic local polynomial regression, and joint color–depth matting. The proposed restoration algorithm not only inpaints the missing data but also corrects and refines the depth map to provide better color–depth consistency. Finally, a dynamic background modeling scheme is proposed to address the disocclusion problem in view synthesis for dynamic environments. Experimental results show the effectiveness of the proposed algorithm and system.
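
The summary describes a restoration pipeline in which missing or unreliable Kinect depth samples are repaired with guidance from color superpixels. As a rough illustration of that superpixel-guided idea only, and not the authors' algorithm, the Python sketch below fills holes in a registered depth map with the median depth of the color superpixel each missing pixel falls in. It assumes numpy and scikit-image, and it substitutes this simple per-segment fill for the paper's probabilistic superpixels, local polynomial regression, and joint matting steps.

```python
# Hypothetical, minimal sketch of superpixel-guided depth hole filling,
# loosely inspired by the color-depth superpixel idea in the abstract.
# It is NOT the paper's method: the probabilistic modeling, polynomial
# regression, and matting refinement are replaced by a per-segment median.
import numpy as np
from skimage.segmentation import slic

def fill_depth_with_superpixels(color, depth, n_segments=400):
    """Fill zero-valued (missing) depth pixels with the median depth of
    the color superpixel each pixel belongs to.

    color : (H, W, 3) uint8 RGB image registered to the depth map
    depth : (H, W) depth map; 0 marks missing samples
    """
    # Segment the color image; the segment labels guide the depth fill.
    labels = slic(color, n_segments=n_segments, compactness=10, start_label=0)
    filled = depth.astype(np.float64)
    missing = depth == 0

    for seg_id in np.unique(labels):
        seg_mask = labels == seg_id
        valid = seg_mask & ~missing
        holes = seg_mask & missing
        if holes.any() and valid.any():
            # Median of the valid samples inside this superpixel stands in
            # for the regression/matting refinement used in the paper.
            filled[holes] = np.median(filled[valid])
    return filled
```

The stand-in keeps only the core intuition, namely that pixels sharing a color superpixel are likely to share similar depth, so a segment-level statistic is a plausible estimate for its missing samples; the paper's probabilistic formulation refines this far beyond a median fill.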
DOI: 10.1007/s00371-016-1312-2