Class-Specific Autoaugment Architecture Based on Schmidt Mathematical Theory for Imbalanced Hyperspectral Classification

Detailed Bibliography
Published in: IEEE Transactions on Geoscience and Remote Sensing, Volume 61, pp. 1-15
Main Authors: Li, Jiaojiao, Diao, Yan, Song, Rui, Xi, Bobo, Li, Yunsong, Du, Qian
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2023
ISSN: 0196-2892, 1558-0644
Description
Summary: Hyperspectral image classification (HSIC) often suffers from severely imbalanced category distributions in real applications, which biases classifiers toward the dominant categories. As an effective remedy, a deep generative model (DGM) can augment the features of imbalanced data through a learnable method to achieve superior classification performance. However, the features extracted by a DGM are preset to a standard Gaussian distribution, which results in low interclass difference. Moreover, the generated features are too consistent with the original ones and therefore cannot improve the discriminability of minority categories (MCs). To overcome these drawbacks, we propose a class-specific autoaugment architecture based on Schmidt mathematical theory (CACS) for the challenge of imbalanced data, which consists of two stages: training a superior feature extractor and augmenting features. The class-specific features of the whole HSI are extracted in stage one, which supports the subsequent feature augmentation module. Specifically, in the first stage we weight the classifier according to cost-sensitive learning to prevent it from overfitting. To enlarge the dispersion between categories, we construct feature prototypes obeying a different Gaussian distribution for each class and generate class-specific features. The features are then augmented in the second stage based on Schmidt's mathematical theory, which enhances the discriminability of minority-class features and thus further improves classification accuracy with interpretability. Extensive experimental results on three benchmark datasets demonstrate that CACS outperforms comparison algorithms, especially on MCs.
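The paper's exact augmentation procedure is not given in this record, but the "Schmidt mathematical theory" named in the title refers to the Gram-Schmidt orthogonalization process. Below is a minimal, hypothetical sketch of the general idea — orthogonalizing a minority-class feature against a majority-class feature subspace so that the augmented feature is pushed away from majority directions. All array shapes, the random features, and the 0.5 scaling factor are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a set of row vectors via classical Gram-Schmidt."""
    basis = []
    for v in vectors:
        # Subtract projections onto the basis built so far.
        w = v - sum(np.dot(v, b) * b for b in basis)
        norm = np.linalg.norm(w)
        if norm > 1e-10:  # skip (near-)linearly-dependent vectors
            basis.append(w / norm)
    return np.array(basis)

# Toy data: hypothetical 8-D feature vectors for illustration only.
rng = np.random.default_rng(0)
majority_feats = rng.normal(size=(3, 8))  # majority-class features
minority_feat = rng.normal(size=8)        # one minority-class feature

# Orthonormal basis spanning the majority feature subspace.
basis = gram_schmidt(majority_feats)

# Component of the minority feature orthogonal to that subspace:
# the part that discriminates it from the majority directions.
projection = basis.T @ (basis @ minority_feat)
orthogonal_part = minority_feat - projection

# Augmented feature nudged along the discriminative direction
# (0.5 is an arbitrary illustrative step size).
augmented = minority_feat + 0.5 * orthogonal_part / np.linalg.norm(orthogonal_part)
```

The key property is that `orthogonal_part` has zero projection onto every majority basis vector, so adding it can only increase the separation between the augmented minority feature and the majority subspace.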
DOI: 10.1109/TGRS.2023.3317885