Multiview-Video-Plus-Depth Coding Based on the Advanced Video Coding Standard


Published in: IEEE Transactions on Image Processing, Vol. 22, No. 9, pp. 3449-3458
Main authors: Hannuksela, Miska M., Gabbouj, Moncef, Rusanovskyy, Dmytro, Su, Wenyi, Chen, Lulu, Li, Ri, Aflaki, Payman, Lan, Deyan, Joachimiak, Michal, Li, Houqiang
Format: Journal Article
Language: English
Publication details: United States, The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.09.2013
ISSN: 1057-7149, 1941-0042
Description
Summary: This paper presents a multiview-video-plus-depth coding scheme, which is compatible with the advanced video coding (H.264/AVC) standard and its multiview video coding (MVC) extension. This scheme introduces several encoding and in-loop coding tools for depth and texture video coding, such as depth-based texture motion vector prediction, depth-range-based weighted prediction, joint inter-view depth filtering, and gradual view refresh. The presented coding scheme was submitted to the 3D video coding (3DV) call for proposals (CfP) of the Moving Picture Experts Group standardization committee. When measured with commonly used objective metrics against the MVC anchor, the proposed scheme provides an average bitrate reduction of 26% and 35% for the 3DV CfP test scenarios with two and three views, respectively. A similar bitrate reduction is observed in an analysis of the subjective test results for the 3DV CfP submissions.
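The 26% and 35% figures in the summary are average bitrate reductions relative to the MVC anchor. As a minimal illustration of how a per-operating-point saving is computed (the paper's actual evaluation follows the 3DV CfP methodology with averaging across rate points and quality matching, which this sketch does not reproduce):

```python
def bitrate_reduction_percent(anchor_kbps: float, proposed_kbps: float) -> float:
    """Percent bitrate saved by a proposed codec relative to an anchor,
    measured at comparable quality. Positive values mean the proposed
    stream is cheaper than the anchor."""
    return 100.0 * (anchor_kbps - proposed_kbps) / anchor_kbps

# Hypothetical example: an anchor stream at 1000 kbps reduced to 740 kbps
# corresponds to a 26% saving, matching the magnitude reported for the
# two-view scenario.
print(bitrate_reduction_percent(1000.0, 740.0))  # → 26.0
```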
DOI:10.1109/TIP.2013.2269274