Remote sensing image compression based on double-sparsity dictionary learning and universal trellis coded quantization

Bibliographic Details
Published in: Proceedings - International Conference on Image Processing, pp. 1665-1669
Main Authors: Zhan, Xin; Zhang, Rong; Yin, Dong; Hu, Anzhou; Hu, Wenlong
Format: Conference Paper
Language: English
Published: IEEE, 01.09.2013
ISSN: 1522-4880
Description
Summary: In this paper, we propose a novel remote sensing image compression method based on double-sparsity dictionary learning and universal trellis coded quantization (UTCQ). Recent years have seen a growing interest in the study of natural image compression based on sparse representation and dictionary learning. We show that using the double-sparsity model to learn a dictionary gives much better compression results for remote sensing images, whose texture is much richer than that of natural images. We also show that compression performance improves significantly when advanced quantization and entropy coding strategies are used to encode the sparse representation coefficients. The proposed method outperforms existing dictionary-based image coding algorithms. Additionally, our method achieves better rate-distortion performance and structural similarity results than the CCSDS and JPEG2000 standards.
DOI: 10.1109/ICIP.2013.6738343
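
The record itself contains no implementation; as a rough, hedged illustration of the double-sparsity representation model the paper builds on, the sketch below writes the effective dictionary as D = Phi * A, where Phi is a fixed base dictionary and A is a column-sparse matrix, and then sparse-codes image patches over D with orthogonal matching pursuit (OMP). All dimensions, the random stand-ins for Phi and A (which the actual method would learn from remote sensing training patches, e.g. via sparse K-SVD), and the uniform scalar quantizer used in place of UTCQ are assumptions for illustration only, not taken from the paper.

import numpy as np
from sklearn.linear_model import orthogonal_mp

rng = np.random.default_rng(0)

# Illustrative sizes only (not taken from the paper).
n = 64          # length of a vectorised 8x8 patch
n_base = 128    # atoms in the fixed base dictionary Phi
n_atoms = 256   # atoms in the effective dictionary D = Phi @ A
k_atom = 6      # non-zeros per column of A (the "double sparsity")
k_code = 8      # non-zeros per patch representation

# Fixed base dictionary Phi with unit-norm columns.  A random matrix stands in
# for the analytic base dictionary (e.g., an overcomplete DCT) normally used.
phi = rng.standard_normal((n, n_base))
phi /= np.linalg.norm(phi, axis=0)

# Sparse atom-representation matrix A: each effective atom combines only
# k_atom base atoms.  In the actual method A is learned; here it is random
# purely so the sketch runs.
A = np.zeros((n_base, n_atoms))
for j in range(n_atoms):
    rows = rng.choice(n_base, size=k_atom, replace=False)
    A[rows, j] = rng.standard_normal(k_atom)

D = phi @ A
D /= np.linalg.norm(D, axis=0)

# Sparse-code a batch of patches over the effective dictionary D with OMP.
patches = rng.standard_normal((n, 100))
codes = orthogonal_mp(D, patches, n_nonzero_coefs=k_code)  # shape (n_atoms, 100)

# Uniform scalar quantization of the coefficients stands in for UTCQ; the real
# method would trellis-code and entropy-code these values instead.
step = 0.5
q_codes = step * np.round(codes / step)
recon = D @ q_codes
print("mean squared reconstruction error:", float(np.mean((patches - recon) ** 2)))

The point of the factorisation D = Phi * A is that the learned part of the dictionary (the sparse matrix A) is compact and cheap to store or transmit, while the resulting atoms remain adapted to the richly textured remote sensing data, which is the property the abstract credits for the improved compression results.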