Weighted Feature Fusion of Convolutional Neural Network and Graph Attention Network for Hyperspectral Image Classification

Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 31, pp. 1559-1572
Main Authors: Dong, Yanni; Liu, Quanwei; Du, Bo; Zhang, Liangpei
Format: Journal Article
Language:English
Published: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), United States, 2022
ISSN: 1057-7149, 1941-0042
Description
Summary: Convolutional Neural Networks (CNNs) and Graph Neural Networks (GNNs), such as Graph Attention Networks (GATs), are two classic neural network models, applied to grid data and graph data, respectively. Both have achieved outstanding performance in hyperspectral image (HSI) classification and have attracted great interest. However, CNNs struggle when training samples are scarce, and GNNs incur a huge computational cost, which restricts the performance of both models. In this paper, we propose the Weighted Feature Fusion of Convolutional Neural Network and Graph Attention Network (WFCG) for HSI classification, which exploits the complementary characteristics of a superpixel-based GAT and a pixel-based CNN. We first establish the GAT with the help of superpixel-based encoder and decoder modules. We then construct the CNN with an attention mechanism. Finally, the features of the two branches are fused with adaptive weights. Rigorous experiments on three real-world HSI data sets show that WFCG can fully explore the high-dimensional features of HSI and obtains competitive results compared with other state-of-the-art methods.
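The fusion step described in the abstract can be pictured with a minimal sketch: assuming both branches already produce per-pixel features of a common dimension, a learnable scalar weight blends the GAT-branch features (decoded from superpixels back to pixels) with the CNN-branch features before classification. All names and the scalar-weight form below are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class WeightedFusion(nn.Module):
    # Hypothetical fusion head: blend GAT and CNN features with a
    # learnable weight, then classify each pixel. Illustrative only.
    def __init__(self, feat_dim, num_classes):
        super().__init__()
        self.alpha = nn.Parameter(torch.tensor(0.5))  # assumed scalar fusion weight
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, gat_feat, cnn_feat):
        # gat_feat, cnn_feat: (num_pixels, feat_dim), projected to a common dimension
        w = torch.sigmoid(self.alpha)  # keep the blend convex, in [0, 1]
        fused = w * gat_feat + (1.0 - w) * cnn_feat
        return self.classifier(fused)

# Example: fuse 64-dim branch features for a 16-class HSI scene
model = WeightedFusion(feat_dim=64, num_classes=16)
logits = model(torch.randn(1024, 64), torch.randn(1024, 64))  # shape (1024, 16)

The sigmoid keeps the combination a convex blend of the two branches; the paper's exact weighting scheme may differ (e.g., per-channel or fixed weights).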
DOI:10.1109/TIP.2022.3144017