Hyperspectral Data Compression Using Fully Convolutional Autoencoder


Detailed Description

Bibliographic Details
Published in: Remote Sensing (Basel, Switzerland), Vol. 14, Issue 10, p. 2472
Main Authors: La Grassa, Riccardo; Re, Cristina; Cremonese, Gabriele; Gallo, Ignazio
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.05.2022
Subjects:
ISSN: 2072-4292
Online Access: Full text
Description
Abstract: In space science and satellite imagery, higher resolution of the acquired data makes images clearer and interpretation more accurate. However, the huge data volume produced by complex on-board satellite instruments becomes a problem that needs to be managed carefully. To reduce the data volume to be stored and transmitted to the ground, the received signals should be compressed while still allowing a good representation of the original source in the reconstruction step. Image compression plays a key role in space science and satellite imagery and, recently, deep learning models have achieved remarkable results in computer vision. In this paper, we propose a spectral signals compressor network based on a deep convolutional autoencoder (SSCNet), and we conduct experiments over multi/hyperspectral and RGB datasets, reporting improvements over all baselines used as benchmarks and over the JPEG family of algorithms. Experimental results demonstrate the effectiveness in terms of compression ratio and spectral signal reconstruction, as well as robustness to data types wider than 8 bits, clearly exhibiting better results under the PSNR, SSIM, and MS-SSIM evaluation criteria.
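Since this record does not describe the SSCNet architecture in detail, the following is a minimal sketch of the general idea it builds on: a fully convolutional autoencoder that compresses a multi-band image cube into a compact latent representation and reconstructs it, evaluated here with PSNR. It assumes PyTorch; the band count, channel widths, strides, and the ConvAutoencoder class itself are illustrative assumptions, not the authors' configuration.

    # Minimal sketch of a fully convolutional autoencoder for multi-band
    # image compression (illustrative only; not the SSCNet from the paper).
    import torch
    import torch.nn as nn

    class ConvAutoencoder(nn.Module):
        def __init__(self, bands: int = 32, latent_channels: int = 8):
            super().__init__()
            # Encoder: strided convolutions shrink the spatial resolution,
            # yielding a compact latent cube to store or transmit.
            self.encoder = nn.Sequential(
                nn.Conv2d(bands, 64, kernel_size=3, stride=2, padding=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(64, latent_channels, kernel_size=3, stride=2, padding=1),
            )
            # Decoder: transposed convolutions upsample back to the input size.
            self.decoder = nn.Sequential(
                nn.ConvTranspose2d(latent_channels, 64, kernel_size=4, stride=2, padding=1),
                nn.ReLU(inplace=True),
                nn.ConvTranspose2d(64, bands, kernel_size=4, stride=2, padding=1),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.decoder(self.encoder(x))

    if __name__ == "__main__":
        # Toy usage: compress and reconstruct a random 32-band 64x64 patch,
        # then report PSNR of the (untrained) reconstruction.
        model = ConvAutoencoder()
        x = torch.rand(1, 32, 64, 64)
        x_hat = model(x)
        mse = torch.mean((x - x_hat) ** 2)
        psnr = 10 * torch.log10(1.0 / mse)
        print(f"PSNR: {psnr.item():.2f} dB")

In this sketch the compression ratio is set by the spatial downsampling and the number of latent channels (here 32 bands at full resolution versus 8 channels at 1/4 resolution); a trained model would minimize the reconstruction error that PSNR, SSIM, and MS-SSIM measure.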
DOI: 10.3390/rs14102472