Class-Specific Autoaugment Architecture Based on Schmidt Mathematical Theory for Imbalanced Hyperspectral Classification


Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, Vol. 61, pp. 1-15
Main Authors: Li, Jiaojiao, Diao, Yan, Song, Rui, Xi, Bobo, Li, Yunsong, Du, Qian
Format: Journal Article
Language:English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2023
ISSN: 0196-2892, 1558-0644
Description
Summary: Hyperspectral image classification (HSIC) often suffers from severely imbalanced category distributions in real applications, which biases classifiers toward the dominant categories. As an effective remedy, a deep generative model (DGM) can augment the features of imbalanced data through a learnable method to achieve superior classification performance. However, the features extracted by a DGM are preset to a standard Gaussian distribution, which results in low interclass difference. In addition, the generated features are too consistent with the original ones and therefore cannot improve the discriminability of minority categories (MCs). To overcome these drawbacks, we propose a class-specific autoaugment architecture based on Schmidt mathematical theory (CACS) for the challenge of imbalanced data, which consists of two stages: the first trains a superior feature extractor, and the second augments features. The class-specific features of the whole HSI are extracted in stage one to support the subsequent feature-augmentation module. Specifically, we weight the classifier in the first stage according to cost-sensitive learning to prevent it from overfitting. To enlarge the dispersion between categories, we construct feature prototypes obeying a different Gaussian distribution for each class and generate class-specific features. The features are then augmented in the second stage based on Schmidt's mathematical theory, which enhances the discriminability of minority-class features and thus further improves classification accuracy with interpretability. Extensive experimental results on three benchmark datasets demonstrate that CACS outperforms comparison algorithms, especially on MCs.
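The abstract does not give the paper's exact formulation, but the core idea of Schmidt-theory-based augmentation can be illustrated with classical Gram-Schmidt orthogonalization: project out the component of a generated feature that lies along the original feature, keeping only the orthogonal remainder, so the augmented sample adds information instead of duplicating the original. This is a minimal sketch under that assumption; the function name, blending weight `alpha`, and feature dimensions are illustrative, not from the paper.

```python
import numpy as np

def gram_schmidt_augment(original, generated, alpha=0.5):
    """Illustrative Schmidt-style augmentation (not the paper's exact method):
    keep only the part of `generated` orthogonal to `original` and blend it in,
    so the augmented feature is not merely a copy of the original."""
    # Gram-Schmidt projection of `generated` onto `original`.
    proj = (generated @ original) / (original @ original) * original
    orthogonal = generated - proj  # component carrying new information
    # Blend the original feature with the orthogonal novelty.
    return original + alpha * orthogonal

rng = np.random.default_rng(0)
f_orig = rng.normal(size=8)  # original minority-class feature
f_gen = rng.normal(size=8)   # feature drawn from a class-specific prototype
f_aug = gram_schmidt_augment(f_orig, f_gen)
# By construction, the injected component is orthogonal to the original feature.
print(abs((f_aug - f_orig) @ f_orig) < 1e-9)
```

Because the injected component is orthogonal to the original feature direction, repeated augmentation spreads minority-class samples away from their originals rather than stacking near-duplicates, which is consistent with the discriminability goal stated in the abstract.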
DOI: 10.1109/TGRS.2023.3317885