Remote sensing image compression based on double-sparsity dictionary learning and universal trellis coded quantization



Bibliographic Details
Published in: Proceedings - International Conference on Image Processing, pp. 1665-1669
Main Authors: Zhan, Xin, Zhang, Rong, Yin, Dong, Hu, Anzhou, Hu, Wenlong
Format: Conference Paper
Language: English
Published: IEEE, 01.09.2013
Subjects:
ISSN:1522-4880
Online Access: Full text
Description
Summary: In this paper, we propose a novel remote sensing image compression method based on double-sparsity dictionary learning and universal trellis coded quantization (UTCQ). Recent years have seen a growing interest in the study of natural image compression based on sparse representation and dictionary learning. We show that using the double-sparsity model to learn a dictionary gives much better compression results for remote sensing images, whose texture is much richer than that of natural images. We also show that compression performance improves significantly when advanced quantization and entropy coding strategies are used to encode the sparse representation coefficients. The proposed method outperforms existing dictionary-based image coding algorithms. Additionally, our method achieves better rate-distortion performance and structural similarity results than the CCSDS and JPEG2000 standards.
DOI:10.1109/ICIP.2013.6738343
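The abstract describes a pipeline of sparse-coding image patches over a dictionary with the double-sparsity structure D = Phi A (a fixed base dictionary Phi times a column-sparse matrix A), then quantizing the sparse coefficients. The following is a minimal toy sketch of that structure, assuming a 2-D DCT base dictionary, a random (not learned) sparse A, greedy orthogonal matching pursuit for sparse coding, and a plain uniform quantizer standing in for UTCQ; all names and parameters are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def dct_dictionary(n=8):
    # 1-D DCT-II basis; the Kronecker product gives a 2-D base dictionary Phi
    k = np.arange(n)
    D1 = np.cos(np.pi * (k[:, None] + 0.5) * k[None, :] / n)
    D1 /= np.linalg.norm(D1, axis=0)
    return np.kron(D1, D1)                     # shape (n*n, n*n)

def omp(D, x, k):
    # Greedy orthogonal matching pursuit: select up to k atoms for signal x
    residual, support = x.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))
        support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coef
    z = np.zeros(D.shape[1])
    z[support] = coef
    return z

rng = np.random.default_rng(0)
Phi = dct_dictionary()                         # fixed base dictionary (64 x 64)

# Double-sparsity structure: each learned atom is a sparse combination of
# base atoms, i.e. A has few nonzeros per column (random here, not learned)
A = np.zeros((64, 64))
for j in range(64):
    idx = rng.choice(64, size=4, replace=False)
    A[idx, j] = rng.standard_normal(4)
A /= np.linalg.norm(Phi @ A, axis=0)           # unit-norm effective atoms
D = Phi @ A                                    # effective dictionary

x = rng.standard_normal(64)                    # a vectorized 8x8 patch (toy data)
z = omp(D, x, k=6)                             # sparse code, at most 6 nonzeros
step = 0.25                                    # uniform quantizer in place of UTCQ
zq = step * np.round(z / step)
x_hat = D @ zq                                 # reconstructed patch
```

In the paper the dictionary (i.e. the matrix A) is learned from training patches and the coefficients go through UTCQ plus entropy coding; this sketch only shows where the double-sparsity factorization and the coefficient quantization sit in the pipeline.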