Adaptive Context Modeling for Arithmetic Coding Using Perceptrons


Bibliographic Details
Published in: IEEE Signal Processing Letters, Volume 29, pp. 1-5
Main Authors: Lopes, Lucas S., Chou, Philip A., de Queiroz, Ricardo L.
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2022
ISSN:1070-9908, 1558-2361
Description
Summary: Arithmetic coding is used in most media compression methods. Context modeling is usually done through frequency counting and look-up tables (LUTs). For long-memory signals, probability modeling with large context sizes is often infeasible. Recently, neural networks have been used to model probabilities of large contexts in order to drive arithmetic coders. These neural networks have been trained offline. We introduce an online method for training a perceptron-based context-adaptive arithmetic coder on-the-fly, called adaptive perceptron coding, which continuously learns the context probabilities and quickly converges to the signal statistics. We test adaptive perceptron coding over a binary image database, with results always exceeding the performance of LUT-based methods for large context sizes and of recurrent neural networks. We also compare the method to a version requiring offline training, which leads to equally satisfactory results.
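The summary describes the core idea: a perceptron-style model estimates P(next bit | context) and is updated online after each coded symbol, so the arithmetic coder adapts on-the-fly. The following is a minimal illustrative sketch of that idea (a logistic perceptron with online gradient updates on the log loss); it is a hypothetical reconstruction for intuition, not the authors' actual model, which the paper itself specifies.

```python
import math

class AdaptivePerceptronModel:
    """Online logistic-perceptron estimate of P(next bit = 1 | context).

    Illustrative sketch: one weight per context bit plus a bias,
    updated by stochastic gradient descent on the log loss after each
    symbol, so the probability model adapts while coding proceeds.
    """

    def __init__(self, context_size, lr=0.1):
        self.w = [0.0] * context_size  # one weight per context bit
        self.b = 0.0                   # bias
        self.lr = lr                   # learning rate

    def predict(self, context):
        # context: sequence of 0/1 bits, length context_size
        z = self.b + sum(w * c for w, c in zip(self.w, context))
        return 1.0 / (1.0 + math.exp(-z))  # sigmoid -> P(bit = 1)

    def update(self, context, bit):
        # Gradient of the log loss w.r.t. the pre-activation z is (p - bit).
        err = self.predict(context) - bit
        self.b -= self.lr * err
        for i, c in enumerate(context):
            if c:
                self.w[i] -= self.lr * err

# Toy usage: a run-structured binary stream where the next bit usually
# repeats its predecessor; a 1-bit context suffices to learn this.
model = AdaptivePerceptronModel(context_size=1)
stream = ([0] * 50 + [1] * 50) * 20
for prev, bit in zip(stream, stream[1:]):
    model.update([prev], bit)
# The model converges toward the conditional statistics of the stream:
# P(1 | prev = 1) high, P(1 | prev = 0) low.
```

In a full coder, `predict` would feed the probability to a binary arithmetic coder before each symbol, and `update` would run immediately after, keeping encoder and decoder models in lockstep without any offline training pass.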
DOI:10.1109/LSP.2022.3223314