Method for speeding up spatial error concealment using prediction mode of the neighboring blocks on H.264 video communication

Detailed Bibliography
Published in: Multimedia Tools and Applications, Volume 82, Issue 9, pp. 13733-13743
Main authors: Hwang, Pyong-Su; Ri, Ju-Hyok; Yun, Yong-Hun
Format: Journal Article
Language: English
Published: New York: Springer US, 01.04.2023 (Springer Nature B.V.)
ISSN: 1380-7501, 1573-7721
Description
Summary: Error concealment can recover video frames that have been corrupted by packet loss over error-prone channels; however, speeding up error concealment is very important for real-time applications such as video conferencing and video chatting. This paper presents a fast spatial error concealment method that uses the prediction modes of the neighboring blocks. First, the edge-prediction-direction weighting values for the sixteen 4 × 4 neighboring blocks are calculated, taking into account the prediction modes of the opposite neighboring blocks. Second, the significant edges within a corrupted macroblock (MB) are estimated using these sixteen weighting values. Finally, an approximation of each corrupted pixel is calculated along each significant edge, and a weighted average of the multiple approximations is computed according to the prediction modes of the neighboring blocks. Experimental results show that the proposed algorithm speeds up multi-directional interpolation by up to 1.17 times while sacrificing only about 0.01 dB of image quality on average compared with the previous method.
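To make the final interpolation step of the abstract more concrete, the following is a minimal Python sketch, not the authors' reference implementation: each corrupted pixel is approximated along a few significant edge directions by interpolating between the boundary pixels of the neighboring blocks, and the approximations are blended with direction weights standing in for the weights derived from the neighbors' intra prediction modes. The helper names (trace_to_boundary, conceal_pixel), the direction pairs, the weights, and the mid-gray fallback are illustrative assumptions.

import numpy as np

def trace_to_boundary(x, y, dx, dy, size=16):
    # Step from interior pixel (x, y) along direction (dx, dy) until the
    # position leaves the size x size macroblock; return the coordinate
    # reached (in the known neighboring area) and the number of steps.
    steps = 0
    while 0 <= x < size and 0 <= y < size:
        x += dx
        y += dy
        steps += 1
    return (x, y), steps

def conceal_pixel(x, y, directions, boundary, size=16):
    # directions: list of (weight, (dx, dy)) pairs; the weight stands in for
    # the edge-prediction-direction weight of one significant edge.
    # boundary: dict mapping coordinates just outside the MB to known pixels.
    estimates, weights = [], []
    for w_dir, (dx, dy) in directions:
        (x1, y1), d1 = trace_to_boundary(x, y, dx, dy, size)
        (x2, y2), d2 = trace_to_boundary(x, y, -dx, -dy, size)
        p1, p2 = boundary.get((x1, y1)), boundary.get((x2, y2))
        if p1 is None or p2 is None:
            continue  # no usable neighbor sample along this direction
        # distance-weighted interpolation between the two boundary samples
        estimates.append((p1 * d2 + p2 * d1) / (d1 + d2))
        weights.append(w_dir)
    if not estimates:
        return 128.0  # fall back to a mid-gray (DC-like) value
    return float(np.average(estimates, weights=weights))

# Illustrative use: a flat gray one-pixel ring of neighbor samples and two
# significant edges (horizontal and diagonal) with assumed weights 0.7 / 0.3.
ring = {(i, j): 100.0 for i in range(-1, 17) for j in range(-1, 17)
        if i in (-1, 16) or j in (-1, 16)}
print(conceal_pixel(5, 9, [(0.7, (1, 0)), (0.3, (1, -1))], ring))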
DOI: 10.1007/s11042-022-13950-5