Deep learning image compression with multi-channel tANS coding and hardware deployment

Bibliographic Details
Published in: Journal of Real-Time Image Processing, Vol. 23, No. 1, p. 1
Main Authors: Zhu, Yaohua; Zhang, Yong; Liu, Ya; Jiang, Jingyu; Zhu, Yanghang; Huang, Mingsheng
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01.01.2026
ISSN: 1861-8200, 1861-8219
Description
Summary: Deep learning-based image compression outperforms traditional methods in coding efficiency, but its computational complexity hinders real-time deployment on embedded devices. This paper proposes a heterogeneous computing system that combines GPU-accelerated inference with CPU-accelerated, lookup-table-based entropy coding, breaking the performance bottleneck through algorithm-hardware co-design. After GPU acceleration, entropy coding becomes the dominant bottleneck, accounting for 73% of runtime. To address this, we introduce three key innovations: (1) replacing rANS with tANS encoding, which converts dynamic computations into static table lookups and reduces encoding latency; (2) a cache-friendly tANS coding scheme for the 192-channel network outputs, which minimizes memory-access latency; and (3) an out-of-range symbol encoding method, which ensures lossless and efficient compression. Experiments demonstrate that, under high compression ratios, tANS reduces latency by 77% compared with traditional rANS, at the cost of a 12.6% loss in compression ratio, while still maintaining image compression quality higher than JPEG2000.
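The abstract's central idea, replacing rANS's per-symbol arithmetic with precomputed state tables, can be illustrated with a generic tANS round trip. The following is a minimal Python sketch under assumed toy parameters (table size L = 16, a three-symbol alphabet, Duda's step-based spread); it is not the paper's implementation and omits the 192-channel cache-friendly layout and the out-of-range escape mechanism. All names here (counts, spread, slots, rank, encode, decode) are illustrative.

# Minimal tANS (table-based ANS) sketch -- illustrative assumptions,
# not values or code from the paper.

L = 16                      # table size, a power of two (real codecs use e.g. 4096)
R = L.bit_length() - 1      # log2(L)

counts = {"a": 8, "b": 5, "c": 3}   # quantized symbol frequencies; must sum to L

# Spread the symbols over the L state slots (Duda's step-based spread).
step = (L >> 1) + (L >> 3) + 3      # coprime with L, scatters symbols evenly
spread = [None] * L
pos = 0
for s, c in counts.items():
    for _ in range(c):
        spread[pos] = s
        pos = (pos + step) % L

# Precompute, per symbol, its sorted slot list, and per slot, the slot's
# rank within its symbol.  These static tables replace runtime arithmetic.
slots = {s: [] for s in counts}
rank = [0] * L
for i, s in enumerate(spread):
    rank[i] = len(slots[s])
    slots[s].append(i)

def encode(symbols):
    """Returns (final_state, bit stack). ANS is LIFO, so encode in reverse."""
    x = L                               # state stays in [L, 2L)
    bits = []
    for s in reversed(symbols):
        c = counts[s]
        k = 0
        while (x >> k) >= 2 * c:        # shift until x >> k lands in [c, 2c)
            k += 1
        for i in range(k):              # push the k low bits of x
            bits.append((x >> i) & 1)
        x = L + slots[s][(x >> k) - c]  # pure table lookup for the next state
    return x, bits

def decode(x, bits, n):
    """Pops bits off the stack; emits symbols in original order."""
    out = []
    for _ in range(n):
        slot = x - L
        s = spread[slot]                # table lookup: which symbol
        out.append(s)
        x_small = counts[s] + rank[slot]
        k = R + 1 - x_small.bit_length()  # bits needed to refill [L, 2L)
        x = x_small
        for _ in range(k):
            x = (x << 1) | bits.pop()
    return out

msg = list("abacabaabbcaa")
state, bits = encode(msg)
assert decode(state, bits, len(msg)) == msg

The encode step contains no multiplications or divisions, only shifts, comparisons, and table reads, which is the property that makes table-driven tANS attractive for CPU-side entropy coding; a multi-channel variant would keep one such table set per coder resident in cache, the kind of locality the abstract's cache-friendly scheme targets.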
DOI: 10.1007/s11554-025-01795-8