Multiview Image Coding Based on Geometric Prediction

Bibliographic details
Published in: IEEE Transactions on Circuits and Systems for Video Technology, Volume 17, Issue 11, pp. 1536-1548
Main authors: San, Xing; Cai, Hua; Lou, Jian-Guang; Li, Jiang
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.11.2007
ISSN: 1051-8215, 1558-2205
Description
Summary: Many existing multiview image/video coding techniques remove inter-viewpoint redundancy by applying disparity compensation within a conventional video coding framework, e.g., H.264/MPEG-4 AVC. However, this conventional methodology is ineffective because it ignores the special characteristics of inter-viewpoint disparity. In this paper, we propose a geometric prediction methodology for accurate disparity vector (DV) prediction, so that the disparity compensation cost can be largely reduced. Based on the new DV predictor, we design a basic framework that can be implemented in most existing multiview image/video coding schemes. We also use the state-of-the-art H.264/MPEG-4 AVC as an example to illustrate how the proposed framework can be integrated with conventional video coding algorithms. Our experiments show that the proposed scheme can effectively track disparity and greatly improve coding performance. Compared with the H.264/MPEG-4 AVC codec, our scheme achieves a gain of up to 1.5 dB when encoding some typical multiview image sequences. We also carry out an experiment to evaluate the robustness of our algorithm. The results indicate that our method is robust and can be used in practical applications.
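
For context, the sketch below illustrates the conventional baseline the abstract refers to: block-based disparity compensation between two rectified views, with the chosen disparity vector coded differentially against a simple neighborhood predictor. It is not the paper's geometric prediction method; the block size, search range, and function names are illustrative assumptions.

    # Illustrative sketch of conventional block-based disparity compensation
    # between two views (the baseline the paper improves upon). Not the
    # paper's geometric DV predictor; block size, search range, and names
    # are assumptions made for this example.
    import numpy as np

    def find_disparity_vector(target, reference, y, x, block=16, search=64):
        """Return the horizontal disparity vector minimizing SAD for one block."""
        cur = target[y:y + block, x:x + block].astype(np.int32)
        best_dv, best_sad = 0, np.inf
        # With rectified cameras, disparity is searched along the horizontal axis.
        for dv in range(-search, search + 1):
            xr = x + dv
            if xr < 0 or xr + block > reference.shape[1]:
                continue
            ref = reference[y:y + block, xr:xr + block].astype(np.int32)
            sad = int(np.abs(cur - ref).sum())
            if sad < best_sad:
                best_sad, best_dv = sad, dv
        return best_dv, best_sad

    def dv_residual(dv, neighbor_dvs):
        """Code the DV differentially against a median predictor, analogous to
        H.264 motion vector prediction; a better predictor shrinks this residual."""
        predictor = int(np.median(neighbor_dvs)) if neighbor_dvs else 0
        return dv - predictor

In a disparity-compensated codec, only the DV residual and the block prediction error are transmitted, so the closer the predictor is to the true disparity, the cheaper the compensation; the paper's contribution is a geometric DV predictor aimed at reducing exactly this cost.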
DOI: 10.1109/TCSVT.2007.905382