Hyperspectral Image Classification with Capsule Network Using Limited Training Samples


Published in: Sensors (Basel, Switzerland), Volume 18, Issue 9, p. 3153
Main authors: Deng, Fei; Pu, Shengliang; Chen, Xuehong; Shi, Yusheng; Yuan, Ting; Pu, Shengyan
Medium: Journal Article
Language: English
Published: Switzerland, MDPI AG, 18 September 2018
ISSN: 1424-8220
Abstract: Deep learning techniques have boosted the performance of hyperspectral image (HSI) classification. In particular, convolutional neural networks (CNNs) have shown performance superior to that of conventional machine learning algorithms. Recently, a novel type of neural network, the capsule network (CapsNet), was proposed to improve upon the most advanced CNNs. In this paper, we present a modified two-layer CapsNet for HSI classification with limited training samples, inspired by the comparability and simplicity of shallower deep learning models. The presented CapsNet is trained on two real HSI datasets, the PaviaU (PU) and SalinasA datasets, which represent complex and simple data, respectively, and are used to investigate the robustness of representation of each model or classifier. In addition, a comparable paradigm of network architecture design is proposed for the comparison of CNN and CapsNet. Experiments demonstrate that CapsNet shows better accuracy and convergence behavior on the complex data than the state-of-the-art CNN. For CapsNet on the PU dataset, the Kappa coefficient, overall accuracy, and average accuracy are 0.9456, 95.90%, and 96.27%, respectively, compared to 0.9345, 95.11%, and 95.63% for the CNN. Moreover, we observed that CapsNet assigns much higher confidence to its predicted probabilities; this finding is analyzed and discussed with probability maps and uncertainty analysis. In the context of the existing literature, CapsNet provides promising results and explicit merits in comparison with the CNN and two baseline classifiers, random forests (RFs) and support vector machines (SVMs).
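The abstract reports three agreement measures: the Kappa coefficient, overall accuracy (OA), and average accuracy (AA). For readers unfamiliar with them, here is a minimal sketch of the standard definitions computed from a confusion matrix (pure Python; the function name is illustrative, not taken from the paper):

```python
def classification_metrics(cm):
    """Kappa coefficient, overall accuracy (OA), and average accuracy (AA)
    from a square confusion matrix given as a list of rows
    (rows: true classes, columns: predicted classes)."""
    n = len(cm)
    total = sum(sum(row) for row in cm)
    diag = [cm[i][i] for i in range(n)]
    row_sums = [sum(row) for row in cm]
    col_sums = [sum(cm[i][j] for i in range(n)) for j in range(n)]
    po = sum(diag) / total  # observed agreement, i.e. overall accuracy
    pe = sum(r * c for r, c in zip(row_sums, col_sums)) / total ** 2  # chance agreement
    kappa = (po - pe) / (1 - pe)
    aa = sum(d / r for d, r in zip(diag, row_sums)) / n  # mean per-class recall
    return kappa, po, aa
```

For example, a two-class matrix `[[40, 10], [20, 30]]` yields OA = 0.70, AA = 0.70, and Kappa = 0.40, showing how Kappa discounts the agreement expected by chance.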
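Capsule networks replace scalar neuron activations with vector-valued capsules whose length encodes the probability that an entity (here, a land-cover class) is present. In the original CapsNet formulation of Sabour et al., that length is bounded by the "squash" nonlinearity; a minimal sketch follows (illustrative only, not the authors' implementation):

```python
import math

def squash(v):
    """CapsNet squash nonlinearity: rescales a capsule's output vector so
    its length lies in [0, 1) and can be read as a class probability,
    while preserving its direction:
        squash(s) = (|s|^2 / (1 + |s|^2)) * s / |s|
    """
    norm_sq = sum(x * x for x in v)
    norm = math.sqrt(norm_sq)
    if norm == 0.0:
        return [0.0] * len(v)
    scale = norm_sq / (1.0 + norm_sq) / norm
    return [scale * x for x in v]
```

Long vectors keep nearly unit length (e.g. `squash([3.0, 4.0])` has length 25/26), while short vectors are shrunk toward zero, which is what lets capsule lengths behave like probabilities during routing.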
Author: Pu, Shengliang; Deng, Fei; Chen, Xuehong; Shi, Yusheng; Yuan, Ting; Pu, Shengyan
AuthorAffiliation 1 School of Geodesy and Geomatics, Wuhan University, Wuhan 430079, China; fdeng@sgg.whu.edu.cn (F.D.); shengliangpu@163.com (S.P.); ztyuan@whu.edu.cn (T.Y.)
2 State Key Laboratory of Earth Surface Processes and Resource Ecology, Beijing Normal University, Beijing 100875, China; cxh1216@gmail.com
3 State Environmental Protection Key Laboratory of Satellite Remote Sensing, Institute of Remote Sensing and Digital Earth, Chinese Academy of Sciences, Beijing 100101, China; shiys@radi.ac.cn
4 State Key Laboratory of Geohazard Prevention and Geoenvironment Protection, Chengdu University of Technology, Chengdu 610059, China
Author_xml 1. Deng, Fei
2. Pu, Shengliang (ORCID: 0000-0003-4685-5621)
3. Chen, Xuehong
4. Shi, Yusheng (ORCID: 0000-0001-7256-3628)
5. Yuan, Ting
6. Pu, Shengyan
ContentType Journal Article
Copyright 2018. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
2018 by the authors. 2018
DOI 10.3390/s18093153
Discipline Engineering
EISSN 1424-8220
ExternalDocumentID oai_doaj_org_article_3e5aed1ce7c248ef89edaaa8802f6e81
PMC6165568
30231574
10_3390_s18093153
Genre Journal Article
GeographicLocations Beijing China
United States--US
China
GrantInformation_xml – fundername: Research Fund of State Key Laboratory of Geohazard Prevention and Geoenvironment Protection
  grantid: SKLGP2018Z006
ISSN 1424-8220
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 9
Keywords capsule network
deep learning
image classification
possibility density
hyperspectral
Language English
License Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
ORCID 0000-0003-4685-5621
0000-0001-7256-3628
OpenAccessLink https://doaj.org/article/3e5aed1ce7c248ef89edaaa8802f6e81
PMID 30231574
PQID 2126876444
PQPubID 2032333
PublicationCentury 2000
PublicationDate 2018-09-18
PublicationDateYYYYMMDD 2018-09-18
PublicationDecade 2010
PublicationPlace Switzerland
PublicationPlace_xml – name: Switzerland
– name: Basel
PublicationTitle Sensors (Basel, Switzerland)
PublicationTitleAlternate Sensors (Basel)
PublicationYear 2018
Publisher MDPI AG
MDPI
StartPage 3153
SubjectTerms capsule network
Classification
deep learning
hyperspectral
image classification
Laboratories
Neural networks
possibility density
Probability
Remote sensing
World Wide Web
Title Hyperspectral Image Classification with Capsule Network Using Limited Training Samples
URI https://www.ncbi.nlm.nih.gov/pubmed/30231574
https://www.proquest.com/docview/2126876444
https://www.proquest.com/docview/2111154628
https://pubmed.ncbi.nlm.nih.gov/PMC6165568
https://doaj.org/article/3e5aed1ce7c248ef89edaaa8802f6e81
Volume 18
WOSCitedRecordID wos000446940600412&url=https%3A%2F%2Fcvtisr.summon.serialssolutions.com%2F%23%21%2Fsearch%3Fho%3Df%26include.ft.matches%3Dt%26l%3Dnull%26q%3D
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
journalDatabaseRights – providerCode: PRVAON
  databaseName: DOAJ Directory of Open Access Journals
  customDbUrl:
  eissn: 1424-8220
  dateEnd: 99991231
  omitProxy: false
  ssIdentifier: ssj0023338
  issn: 1424-8220
  databaseCode: DOA
  dateStart: 20010101
  isFulltext: true
  titleUrlDefault: https://www.doaj.org/
  providerName: Directory of Open Access Journals
– providerCode: PRVHPJ
  databaseName: ROAD: Directory of Open Access Scholarly Resources
  customDbUrl:
  eissn: 1424-8220
  dateEnd: 99991231
  omitProxy: false
  ssIdentifier: ssj0023338
  issn: 1424-8220
  databaseCode: M~E
  dateStart: 20010101
  isFulltext: true
  titleUrlDefault: https://road.issn.org
  providerName: ISSN International Centre
– providerCode: PRVPQU
  databaseName: Health & Medical Collection
  customDbUrl:
  eissn: 1424-8220
  dateEnd: 99991231
  omitProxy: false
  ssIdentifier: ssj0023338
  issn: 1424-8220
  databaseCode: 7X7
  dateStart: 20010101
  isFulltext: true
  titleUrlDefault: https://search.proquest.com/healthcomplete
  providerName: ProQuest
– providerCode: PRVPQU
  databaseName: ProQuest Central
  customDbUrl:
  eissn: 1424-8220
  dateEnd: 99991231
  omitProxy: false
  ssIdentifier: ssj0023338
  issn: 1424-8220
  databaseCode: BENPR
  dateStart: 20010101
  isFulltext: true
  titleUrlDefault: https://www.proquest.com/central
  providerName: ProQuest
– providerCode: PRVPQU
  databaseName: Publicly Available Content Database
  customDbUrl:
  eissn: 1424-8220
  dateEnd: 99991231
  omitProxy: false
  ssIdentifier: ssj0023338
  issn: 1424-8220
  databaseCode: PIMPY
  dateStart: 20010101
  isFulltext: true
  titleUrlDefault: http://search.proquest.com/publiccontent
  providerName: ProQuest
link http://cvtisr.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwrV3BTtwwEB1R6KEcKtpSSKErF_XQSwSOHds5FrQIDqxWLVTbU-S1xwIJAmKXHvvtHTvZaLdC6qUXH-xJ4ozH9rxk_AbgM_lrJXnVIjcqfq2SeJRbPjW5RymELHRVWZeSTejRyEwm1Xgp1VeMCWvpgVvFHQosLXruULtCGgymQm-tJbMrgsJ06Logr2cBpjqoJQh5tTxCgkD94SyyVAleipXdJ5H0P-dZ_h0gubTjnG7B685VZF_bLr6BNWzewuYSgeA7-HFGMLI9LUk3YOd3tDqwlOcyRgAlpbP4pZWdWELDt8hGbdQ3S5ECrDvdxC67PBHsu41kwbNtuDodXp6c5V2mhNwRvJ3nnFuU5NspI5F04svSGY0onOTBkuZKXwRjK48uVNMQKlKk1s7aKkyldzyI97De3De4C8zTfk3SiuY5YS1pjRLoSivpCa7gfprBl4UGa9fRiMdsFrc1wYmo7LpXdgYHvehDy53xnNBxHIZeINJdpwoygrozgvpfRpDB_mIQ624OzmralBWt9fQSGXzqm2n2xF8itsH7pyjDIx-RKkwGO-2Y9z2J6ZR4qelqvWINK11dbWlurhNDt-IqMrt9-B_vtgevyElLMSrc7MP6_PEJP8JL92t-M3scwAs90ak0A9g4Ho7G3wZpKlB58XtIdePzi_HPP1ygEpc
linkProvider Directory of Open Access Journals
linkToHtml http://cvtisr.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw1V1Lb9NAEB6VFIly4P0wFFgQSFysdr1re31ACApVorZRJAJqT2azOy6VihPiBMSf4jcyu37QoIpbD1ztSTxrfzueWc9-H8BzytdiyqpFqBK3WiVxO9R8okKLUggZpVmmjRebSIdDdXiYjdbgV7sXxrVVtjHRB2o7NW6NfItCbEIzV0r5evYtdKpR7utqK6FRw2IPf_6gkq16NXhHz_dFFO2-H-_0w0ZVIDRUCi5CzjVKyoMSJREpW4tjo1JEYSQvNKYmtlGhdGbRFNmkKDKpME2N1lkxkdbwQtD_XoJ1SWBXPVgfDQ5GR12JJ6jiq_mLhMi2tyrHjiV4LFbeel4c4LyM9u_GzDNvut3r_9s9ugHXmpyavaknwU1Yw_IWXD3DtHgbPvWp3q63ldKI2eArhVHmBUFdq5RHJ3NL0mxHz6rlKbJh3R7PfEsFa7aBsXEjqME-aMeqXN2BjxcysLvQK6cl3gdmKbEh64QCIhWlUqtEoIm1pCuYiNtJAC_bR56bhm_dyX6c5lR3OXTkHToCeNaZzmqSkfOM3jrcdAaOF9wfmM6P8ybM5AJjjZYbGkpEvhcqQ6u1piBNntKAA9hskZM3warK_8AmgKfdaQoz7tuRLnG6dDbcETclkQrgXg3SzhOnO8XjlH6drsB3xdXVM-XJF09lnvDEUeA9-LdbT-BKf3ywn-8PhnsPYYNyVt-yw9Um9BbzJT6Cy-b74qSaP27mI4PPFw3v3457e78
linkToPdf http://cvtisr.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw1V3fb9MwED6NDSF44OeAwACDQOIl6hw7if2AEGxUqwZVJQYaT5nrXLZJIy1Ny7R_jb-Oc-KEFU287YHX5Nqck8-X75zzdwAvia_FxKpFqBK3WiVxMzR8rMIcpRAySrU2tm42kQ6Han9fj1bgV7sXxpVVtjGxDtT5xLo18h6F2IRmrpSyV_iyiNF2_-30R-g6SLkvrW07jQYiu3h2Sulb9WawTc_6VRT1P-xt7YS-w0BoKS2ch5wblMSJEiURibnFsVUporCSFwZTG-dRoYzO0RZ6XBRaKkxTa4wuxjK3vBD0v1dgjSi5pDm2Nhp8Gn3r0j1B2V-jZSSE3uxVTilL8FgsvQHrRgEXsdu_izTPvfX6t_7n-3Ubbnquzd41k-MOrGB5F26cU2C8B193KA9vtpvS6NngO4VXVjcKdSVUNWqZW6pmW2ZaLU6QDZuyeVaXWjC_PYzt-UYb7LNxasvVOny5lIHdh9VyUuJDYDkRHrJOKFBSsiqNSgTa2Ei6go14Pg7gdfv4M-t12F07kJOM8jGHlKxDSgAvOtNpIz5ykdF7h6HOwOmF1wcms8PMh59MYGww55aGEpHvhdKYG2MoeJOnNOAANloUZT6IVdkfCAXwvDtN4cd9UzIlThbOhjtBpyRSATxoANt54vpR8TilX6dLUF5ydflMeXxUS5wnPHHSeI_-7dYzuEaYzj4OhruP4TpR2bqSh6sNWJ3PFvgErtqf8-Nq9tRPTQYHl43u3-OFhH8
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Hyperspectral+Image+Classification+with+Capsule+Network+Using+Limited+Training+Samples&rft.jtitle=Sensors+%28Basel%2C+Switzerland%29&rft.au=Deng%2C+Fei&rft.au=Pu%2C+Shengliang&rft.au=Chen%2C+Xuehong&rft.au=Shi%2C+Yusheng&rft.date=2018-09-18&rft.issn=1424-8220&rft.eissn=1424-8220&rft.volume=18&rft.issue=9&rft.spage=3153&rft_id=info:doi/10.3390%2Fs18093153&rft.externalDBID=n%2Fa&rft.externalDocID=10_3390_s18093153