Tensor non-local low-rank regularization for recovering compressed hyperspectral images
Saved in:
| Published in: | 2017 IEEE International Conference on Image Processing (ICIP), pp. 3046 - 3050 |
|---|---|
| Main authors: | , , |
| Format: | Conference proceedings |
| Language: | English |
| Published: | IEEE, 01.09.2017 |
| Subjects: | |
| ISSN: | 2381-8549 |
| Online access: | Full text |
| Summary: | Sparsity-based methods have been widely used in hyperspectral imagery compression recovery (HSI-CR). However, most available HSI-CR methods operate in vector space by vectorizing the hyperspectral cube in the spatial and spectral domains, which destroys spatial and spectral correlations and distorts spatial and spectral information in the recovery. At the same time, vectorization prevents the intrinsic structured sparsity of the HSI from being adequately utilized. In this paper, a tensor non-local low-rank regularization (TNLR) approach is proposed to exploit this essential structured sparsity and explore its advantages for CR of hyperspectral imagery. Specifically, a tensor nuclear norm penalty is used as the tensor low-rank regularization term to describe the spatial-and-spectral correlation hidden in the HSI. To further improve computational efficiency, a fast implementation of the proposed algorithm is developed using the alternating direction method of multipliers (ADMM). Experimental results show that the proposed TNLR-CR algorithm significantly outperforms existing state-of-the-art CR techniques for hyperspectral image recovery. |
|---|---|
| ISSN: | 2381-8549 |
| DOI: | 10.1109/ICIP.2017.8296842 |
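
The abstract above describes the TNLR idea only at a high level; the exact tensor nuclear norm definition and the ADMM splitting are not given in this record. As a rough illustration of the kind of regularizer involved, the minimal NumPy sketch below assumes the common sum-of-mode-unfoldings surrogate for the tensor nuclear norm and applies its proximal operator (singular value thresholding on each unfolding) to a small synthetic hyperspectral cube. The function names, the weighting, and the averaging of per-mode updates are illustrative assumptions, not the paper's actual algorithm; in a full ADMM scheme this low-rank proximal step would alternate with a data-fidelity step that enforces consistency with the compressive measurements.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-k unfolding: move axis `mode` to the front and flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def fold(matrix, mode, shape):
    """Inverse of `unfold`: restore the original tensor shape."""
    full_shape = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(matrix.reshape(full_shape), 0, mode)

def tensor_nuclear_norm(tensor, weights=(1/3, 1/3, 1/3)):
    """Weighted sum of nuclear norms of the mode unfoldings (a common surrogate)."""
    return sum(w * np.linalg.norm(unfold(tensor, k), ord='nuc')
               for k, w in enumerate(weights))

def svt(matrix, tau):
    """Singular value thresholding: proximal operator of tau * (nuclear norm)."""
    u, s, vt = np.linalg.svd(matrix, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    return (u * s) @ vt

# Example: one low-rank proximal step on a noisy HSI patch cube
# (rows x cols x bands), as it might appear inside an ADMM iteration.
rng = np.random.default_rng(0)
low_rank = rng.standard_normal((8, 8, 2)) @ rng.standard_normal((2, 31))  # 8x8x31 cube
noisy = low_rank + 0.1 * rng.standard_normal(low_rank.shape)

denoised = np.mean(
    [fold(svt(unfold(noisy, k), tau=0.5), k, noisy.shape) for k in range(3)],
    axis=0,
)
print("tensor nuclear norm before:", round(tensor_nuclear_norm(noisy), 2))
print("tensor nuclear norm after: ", round(tensor_nuclear_norm(denoised), 2))
```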