DHCAE: Deep Hybrid Convolutional Autoencoder Approach for Robust Supervised Hyperspectral Unmixing

Detailed Bibliography
Published in: Remote Sensing (Basel, Switzerland), Volume 14, Issue 18, p. 4433
Main Authors: Hadi, Fazal; Yang, Jingxiang; Ullah, Matee; Ahmad, Irfan; Farooque, Ghulam; Xiao, Liang
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.09.2022
ISSN: 2072-4292
Description
Summary: Hyperspectral unmixing (HSU) is a crucial method for determining the fractional abundances of the materials (endmembers) in each pixel. Most spectral unmixing methods suffer under low signal-to-noise ratios caused by simultaneously noisy pixels and bands, which calls for robust HSU techniques that exploit both the 3D (spectral–spatial) and 2D (spatial) domains. In this paper, we present a new method for robust supervised HSU based on a deep hybrid (3D and 2D) convolutional autoencoder (DHCAE) network. Most HSU methods adopt a 2D model for simplicity, whereas HSU performance depends on both spectral and spatial information. The DHCAE network exploits the spectral and spatial information of remote sensing images for abundance map estimation. In addition, DHCAE uses dropout to regularize the network for smooth learning and to avoid overfitting. Quantitative and qualitative results confirm that the proposed DHCAE network achieves better hyperspectral unmixing performance on a synthetic dataset and three real hyperspectral images, i.e., the Jasper Ridge, Urban, and Washington DC Mall datasets.
DOI: 10.3390/rs14184433
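
Illustrative sketch: the abstract only outlines the architecture (a 3D-then-2D convolutional encoder with dropout that estimates abundance maps, followed by reconstruction under the known endmembers). The code below is a minimal sketch of that idea, not the authors' released implementation; it assumes PyTorch, input patches shaped (bands, H, W), a known endmember matrix for the supervised setting, and guessed layer widths, kernel sizes, and dropout rate. The class name DHCAESketch and all hyperparameters are hypothetical.

import torch
import torch.nn as nn
import torch.nn.functional as F

class DHCAESketch(nn.Module):
    """Hybrid 3D/2D convolutional encoder with a fixed linear-mixing decoder (sketch)."""

    def __init__(self, bands, num_endmembers, endmembers, dropout=0.2):
        super().__init__()
        # 3D convolutions learn joint spectral-spatial features from each patch.
        self.conv3d = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=(7, 3, 3), padding=(3, 1, 1)),
            nn.ReLU(),
            nn.Conv3d(8, 16, kernel_size=(5, 3, 3), padding=(2, 1, 1)),
            nn.ReLU(),
        )
        # 2D convolutions refine spatial structure after folding the spectral
        # axis into the channel dimension; dropout regularizes the network.
        self.conv2d = nn.Sequential(
            nn.Conv2d(16 * bands, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Dropout2d(dropout),
            nn.Conv2d(64, num_endmembers, kernel_size=1),
        )
        # Supervised unmixing: the decoder is the known endmember matrix
        # (bands x num_endmembers), kept fixed so only abundances are learned.
        self.register_buffer("endmembers", endmembers)

    def forward(self, x):
        # x: (batch, bands, H, W) hyperspectral patches
        b, _, h, w = x.shape
        feat = self.conv3d(x.unsqueeze(1))       # (batch, 16, bands, H, W)
        feat = feat.reshape(b, -1, h, w)         # fold spectra into channels
        logits = self.conv2d(feat)               # (batch, num_endmembers, H, W)
        # Softmax enforces non-negativity and the sum-to-one abundance constraints.
        abundances = F.softmax(logits, dim=1)
        # Linear mixing model reconstruction: x_hat = E * a for every pixel.
        x_hat = torch.einsum("cp,bphw->bchw", self.endmembers, abundances)
        return abundances, x_hat

# Example training step (placeholder sizes): minimize reconstruction error
# between the input patch and its remix under the known endmembers.
# E = torch.rand(198, 4)                          # hypothetical endmember matrix
# model = DHCAESketch(bands=198, num_endmembers=4, endmembers=E)
# patches = torch.rand(8, 198, 16, 16)
# abundances, x_hat = model(patches)
# loss = F.mse_loss(x_hat, patches)

The fixed linear decoder and softmax output are one common way to encode the supervised linear mixing model with non-negativity and sum-to-one constraints; the paper's exact layer configuration and loss are not given in this record.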