Hybrid Probabilistic Sparse Coding With Spatial Neighbor Tensor for Hyperspectral Imagery Classification

Detailed Bibliography
Published in: IEEE Transactions on Geoscience and Remote Sensing, Volume 56, Issue 5, pp. 2491-2502
Main authors: Yang, Lixia; Wang, Min; Yang, Shuyuan; Zhao, Hui; Jiao, Licheng; Feng, Xiangchu
Format: Journal Article
Language: English
Publication details: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 1 May 2018
ISSN: 0196-2892, 1558-0644
Description
Summary: Under the umbrella of tensor algebra, this paper proposes a new sparse-coding-based classifier (SCC) for hyperspectral imagery classification (HIC). By utilizing the tensor forms of hyperspectral pixels, we advance a tensor sparse-coding model that preserves as many of the original spatial constraints of a pixel and its spatial neighbors as possible. Furthermore, to alleviate the classification uncertainty resulting from widely existing mixed pixels, this paper constructs a regularization term that maximizes the likelihood of the sparse-coding tensor defined on the posterior class probability. By combining tensor sparse coding with maximum likelihood estimation, a hybrid probabilistic SCC with spatial neighbor tensor (HPSCC-SNT) is proposed, which makes each pixel well represented by the training pixels belonging to the same class. The performance of HPSCC-SNT is evaluated on three real hyperspectral imagery data sets; the results show that it achieves accurate and robust HIC results and outperforms state-of-the-art methods.
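
To make the sparse-coding-classifier idea in the abstract concrete, below is a minimal, self-contained Python sketch of a generic sparse-coding classifier: a test pixel is sparsely coded over a dictionary of training pixels and assigned to the class whose atoms reconstruct it best. This is an illustration only, not the authors' HPSCC-SNT; the ISTA solver, the lam parameter, the residual-based decision rule, and the toy data are assumptions, and the paper's spatial neighbor tensor and probabilistic regularization term are not reproduced here.

# Hedged sketch: a generic sparse-coding classifier for a single hyperspectral
# pixel. NOT the authors' HPSCC-SNT model; the solver, penalty weight, and
# residual rule below are illustrative assumptions.
import numpy as np

def ista_sparse_code(D, y, lam=0.1, n_iter=200):
    """Solve min_a 0.5*||y - D a||^2 + lam*||a||_1 with ISTA (proximal gradient)."""
    L = np.linalg.norm(D, 2) ** 2            # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - y)
        z = a - grad / L
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
    return a

def classify_by_residual(D, labels, y, lam=0.1):
    """Assign y to the class whose training atoms best reconstruct it."""
    a = ista_sparse_code(D, y, lam)
    residuals = {}
    for c in np.unique(labels):
        mask = (labels == c)
        residuals[c] = np.linalg.norm(y - D[:, mask] @ a[mask])
    return min(residuals, key=residuals.get)

# Toy usage: 20 training pixels with 50 spectral bands, two classes.
rng = np.random.default_rng(0)
D = rng.normal(size=(50, 20))
D /= np.linalg.norm(D, axis=0)               # unit-norm dictionary atoms
labels = np.array([0] * 10 + [1] * 10)
test_pixel = D[:, 3] + 0.01 * rng.normal(size=50)
print(classify_by_residual(D, labels, test_pixel))

In the paper's model, the code is instead a tensor defined over a pixel's spatial neighborhood, and the objective carries an additional likelihood term on the posterior class probabilities; the sketch above shows only the shared sparse-coding-plus-residual backbone.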
DOI: 10.1109/TGRS.2017.2732480