Remote sensing image compression based on double-sparsity dictionary learning and universal trellis coded quantization
Saved in:
| Published in: | Proceedings - International Conference on Image Processing, pp. 1665-1669 |
|---|---|
| Main authors: | , , , , |
| Format: | Conference paper |
| Language: | English |
| Published: | IEEE, 01.09.2013 |
| Subjects: | |
| ISSN: | 1522-4880 |
| Online access: | Get full text |
| Summary: | In this paper, we propose a novel remote sensing image compression method based on double-sparsity dictionary learning and universal trellis coded quantization (UTCQ). Recent years have seen a growing interest in the study of natural image compression based on sparse representation and dictionary learning. We show that using the double-sparsity model to learn a dictionary gives much better compression results for remote sensing images, whose texture is much richer than that of natural images. We also show that compression performance improves significantly when advanced quantization and entropy coding strategies are used to encode the sparse representation coefficients. The proposed method outperforms existing dictionary-based image coding algorithms. Additionally, our method achieves better rate-distortion performance and structural similarity results than the CCSDS and JPEG2000 standards. |
|---|---|
| DOI: | 10.1109/ICIP.2013.6738343 |
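The double-sparsity model mentioned in the summary represents each learned dictionary atom as a sparse combination of atoms from a fixed base dictionary, i.e. D = ΦA with a sparse matrix A, and image patches are then sparse-coded over D. The sketch below only illustrates that model and is not the authors' implementation: the base dictionary is an overcomplete DCT, the atom-representation matrix A is random rather than learned (the paper would learn it from remote sensing patches), the patch is synthetic, and the UTCQ and entropy-coding stages are omitted. All names and parameters here are illustrative assumptions.

```python
import numpy as np

def overcomplete_dct(patch_size=8, atoms_per_dim=11):
    """1-D overcomplete DCT, extended to 2-D patches via a Kronecker product."""
    t = np.arange(patch_size)
    d1 = np.cos(np.outer(t, np.arange(atoms_per_dim)) * np.pi / atoms_per_dim)
    d1[:, 1:] -= d1[:, 1:].mean(axis=0)        # remove DC from non-constant atoms
    d1 /= np.linalg.norm(d1, axis=0)           # unit-norm columns
    return np.kron(d1, d1)                     # shape (patch_size**2, atoms_per_dim**2)

def omp(D, x, k):
    """Plain orthogonal matching pursuit: a k-sparse code of x over dictionary D."""
    residual, support = x.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        coeffs, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coeffs
    gamma = np.zeros(D.shape[1])
    gamma[support] = coeffs
    return gamma

rng = np.random.default_rng(0)
base = overcomplete_dct()                      # fixed base dictionary Phi (64 x 121)
n_base, n_atoms, atom_sparsity = base.shape[1], 256, 6

# Double-sparsity model: the effective dictionary is D = Phi @ A, where every
# column of A is sparse. A is drawn at random here purely for illustration; in
# the paper it would be learned from training patches.
A = np.zeros((n_base, n_atoms))
for j in range(n_atoms):
    idx = rng.choice(n_base, size=atom_sparsity, replace=False)
    A[idx, j] = rng.standard_normal(atom_sparsity)
D = base @ A
D /= np.linalg.norm(D, axis=0)

patch = rng.standard_normal(base.shape[0])     # stand-in for a vectorised 8x8 patch
code = omp(D, patch, k=8)                      # sparse coefficients (quantised/entropy-coded in the paper)
print(np.count_nonzero(code), "nonzero coefficients")
```

Because A is sparse, the learned dictionary stays compact and structured even when the effective dictionary D is large, which is one reason the model suits texture-rich remote sensing imagery.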