CNN-Enhanced Graph Convolutional Network With Pixel- and Superpixel-Level Feature Fusion for Hyperspectral Image Classification
| Published in: | IEEE Transactions on Geoscience and Remote Sensing, Vol. 59, No. 10, pp. 8657-8671 |
|---|---|
| Main Authors: | Liu, Qichao; Xiao, Liang; Yang, Jingxiang; Wei, Zhihui |
| Format: | Journal Article |
| Language: | English |
| Published: | New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.10.2021 |
| Subjects: | Hyperspectral image (HSI) classification; graph convolutional network (GCN); convolutional neural network (CNN); feature fusion; graph encoder and decoder |
| ISSN: | 0196-2892, 1558-0644 |
| Abstract | Recently, the graph convolutional network (GCN) has drawn increasing attention in the hyperspectral image (HSI) classification. Compared with the convolutional neural network (CNN) with fixed square kernels, GCN can explicitly utilize the correlation between adjacent land covers and conduct flexible convolution on arbitrarily irregular image regions; hence, the HSI spatial contextual structure can be better modeled. However, to reduce the computational complexity and promote the semantic structure learning of land covers, GCN usually works on superpixel-based nodes rather than pixel-based nodes; thus, the pixel-level spectral-spatial features cannot be captured. To fully leverage the advantages of the CNN and GCN, we propose a heterogeneous deep network called CNN-enhanced GCN (CEGCN), in which CNN and GCN branches perform feature learning on small-scale regular regions and large-scale irregular regions, and generate complementary spectral-spatial features at pixel and superpixel levels, respectively. To alleviate the structural incompatibility of the data representation between the Euclidean data-oriented CNN and non-Euclidean data-oriented GCN, we propose the graph encoder and decoder to propagate features between image pixels and graph nodes, thus enabling the CNN and GCN to collaborate in a single network. In contrast to other GCN-based methods that encode HSI into a graph during preprocessing, we integrate the graph encoding process into the network and learn edge weights from training data, which can promote the node feature learning and make the graph more adaptive to HSI content. Extensive experiments on three data sets demonstrate that the proposed CEGCN is both qualitatively and quantitatively competitive compared with other state-of-the-art methods. |
|---|---|
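The abstract outlines the CEGCN data flow: a CNN branch extracts pixel-level spectral-spatial features on the regular grid, a graph encoder aggregates pixel features into superpixel nodes, a GCN branch propagates features over the superpixel graph, and a graph decoder projects node features back to pixels so the two branches can be fused for per-pixel classification. The PyTorch sketch below is not the authors' implementation; the class name `CEGCNSketch`, the layer sizes, and the use of a fixed, precomputed assignment matrix and row-normalized adjacency are illustrative assumptions (CEGCN itself learns the edge weights from training data and integrates graph construction into the network).

```python
# Minimal sketch (assumed structure, not the published CEGCN code) of the
# pixel/superpixel feature flow described in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CEGCNSketch(nn.Module):
    """Illustrative two-branch pixel/superpixel network (not the published model)."""

    def __init__(self, in_bands: int, hidden: int, n_classes: int):
        super().__init__()
        # CNN branch: pixel-level features on small regular neighborhoods.
        self.cnn = nn.Sequential(
            nn.Conv2d(in_bands, hidden, kernel_size=3, padding=1),
            nn.BatchNorm2d(hidden),
            nn.LeakyReLU(),
            nn.Conv2d(hidden, hidden, kernel_size=3, padding=1),
        )
        # GCN branch: a single graph-convolution weight matrix for the nodes.
        self.gcn_weight = nn.Linear(in_bands, hidden, bias=False)
        # 1x1 convolution classifies each pixel from the fused features.
        self.classifier = nn.Conv2d(2 * hidden, n_classes, kernel_size=1)

    def forward(self, x, assoc, adj):
        # x     : (1, C, H, W) hyperspectral cube (C spectral bands)
        # assoc : (H*W, S) hard pixel-to-superpixel assignment, one-hot rows
        # adj   : (S, S) row-normalized superpixel adjacency (edge weights)
        _, c, h, w = x.shape

        # CNN branch: small-scale regular regions at the pixel level.
        feat_cnn = self.cnn(x)                                  # (1, hidden, H, W)

        # Graph encoder: average the pixels of each superpixel into a node.
        pixels = x.view(c, h * w).t()                           # (H*W, C)
        q_norm = assoc / assoc.sum(dim=0, keepdim=True).clamp(min=1.0)
        nodes = q_norm.t() @ pixels                             # (S, C)

        # GCN branch: one propagation step over the superpixel graph.
        nodes = F.leaky_relu(adj @ self.gcn_weight(nodes))      # (S, hidden)

        # Graph decoder: copy each node's feature back to its member pixels.
        feat_gcn = (assoc @ nodes).t().reshape(1, -1, h, w)     # (1, hidden, H, W)

        # Fuse the complementary pixel- and superpixel-level features.
        fused = torch.cat([feat_cnn, feat_gcn], dim=1)
        return self.classifier(fused)                           # (1, n_classes, H, W)


# Toy usage with random data (8x8 image, 30 bands, 16 superpixels, 5 classes):
if __name__ == "__main__":
    model = CEGCNSketch(in_bands=30, hidden=32, n_classes=5)
    x = torch.randn(1, 30, 8, 8)
    assoc = F.one_hot(torch.randint(0, 16, (64,)), num_classes=16).float()
    adj = torch.softmax(torch.randn(16, 16), dim=1)
    print(model(x, assoc, adj).shape)                           # torch.Size([1, 5, 8, 8])
```

In practice the assignment matrix would come from a superpixel segmentation (e.g., an SLIC-style method, as referenced by the paper) rather than random labels, and the adjacency would be built from superpixel neighborhoods; both are kept fixed here purely to keep the sketch self-contained.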
| Author | Liu, Qichao; Xiao, Liang; Yang, Jingxiang; Wei, Zhihui |
| Author details | Qichao Liu (ORCID 0000-0003-0134-9450, qc.l@njust.edu.cn); Liang Xiao (ORCID 0000-0003-0178-9384, xiaoliang@mail.njust.edu.cn); Jingxiang Yang (ORCID 0000-0002-1234-0614, yang123jx@njust.edu.cn); Zhihui Wei (ORCID 0000-0002-4841-6051, gswei@njust.edu.cn); all authors are with the School of Computer Science and Engineering, Nanjing University of Science and Technology, Nanjing, China |
| CODEN | IGRSD2 |
| ContentType | Journal Article |
| Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021 |
| DOI | 10.1109/TGRS.2020.3037361 |
| Discipline | Engineering; Physics |
| EISSN | 1558-0644 |
| EndPage | 8671 |
| Genre | orig-research |
| GrantInformation | National Natural Science Foundation of China (61871226, 62001226, 61571230); Fundamental Research Funds for the Central Universities (30918011104, 30920021134); National Major Research Plan of China (2016YFF0103604); Jiangsu Provincial Social Developing Project (BE2018727) |
| ISICitedReferencesCount | 330 |
| ISSN | 0196-2892 |
| IsPeerReviewed | true |
| IsScholarly | true |
| Issue | 10 |
| Language | English |
| License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
| PageCount | 15 |
| PublicationDate | 2021-10-01 |
| PublicationPlace | New York |
| PublicationTitle | IEEE transactions on geoscience and remote sensing |
| PublicationTitleAbbrev | TGRS |
| PublicationYear | 2021 |
| Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| StartPage | 8657 |
| SubjectTerms | Artificial neural networks; Classification; Coders; Computational modeling; Computer applications; Convolution; Convolutional neural network (CNN); Data; Data processing; Decoding; Deep learning; Feature extraction; feature fusion; graph convolutional network (GCN); graph encoder and decoder; hyperspectral image (HSI) classification; Hyperspectral imaging; Image classification; Incompatibility; Kernel; Learning; Machine learning; Methods; Neural networks; Nodes; Pixels; Regions; Training; Training data |
| Title | CNN-Enhanced Graph Convolutional Network With Pixel- and Superpixel-Level Feature Fusion for Hyperspectral Image Classification |
| URI | https://ieeexplore.ieee.org/document/9268479 https://www.proquest.com/docview/2575980574 |
| Volume | 59 |