CTNet: an efficient coupled transformer network for robust hyperspectral unmixing

Detailed Description

Bibliographic Details
Published in: International Journal of Remote Sensing, Vol. 45, No. 17, pp. 5679-5712
Main authors: Meng, Fanlei; Sun, Haixin; Li, Jie; Xu, Tingfa
Format: Journal Article
Language: English
Published: London: Taylor & Francis, 01.09.2024
Taylor & Francis Ltd
Subjects:
ISSN: 0143-1161; EISSN: 1366-5901
Online access: Full text
Abstract This study introduces the Coupled Transformer Network (CTNet), an architecture designed to enhance the robustness and effectiveness of hyperspectral unmixing (HSU) tasks, addressing key limitations of traditional autoencoder (AE) frameworks. Traditional AEs, consisting of an encoder and a decoder, effectively learn and reconstruct low-dimensional abundance relationships from high-dimensional hyperspectral data, but they often struggle with spectral variability (SV) and spatial correlations, which can lead to uncertainty in the resulting abundance estimates. CTNet addresses these limitations by incorporating a two-stream half-Siamese network with an additional encoder trained on pseudo-pure pixels, and it further integrates a cross-attention module to leverage global information. This configuration not only guides the AE towards more accurate abundance estimates by directly addressing SV, but also enhances the network's ability to capture complex spectral information. To reduce the reconstruction errors typical of AEs, a transcription loss constraint is applied, which preserves essential details and material-related information that are often lost during pixel-level reconstruction. Experimental validation on a synthetic dataset and three widely used datasets confirms that CTNet outperforms several state-of-the-art methods, providing a more robust and effective solution to HSU challenges.
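The abstract names the main ingredients of the architecture: a mixed-pixel encoder, a second (half-Siamese) encoder fed with pseudo-pure pixels, a cross-attention module that fuses the two streams, a linear decoder for reconstruction, and a transcription loss on top of the usual pixel-level reconstruction loss. The PyTorch sketch below only illustrates how such a coupled, cross-attention autoencoder for linear unmixing could be wired together; the layer widths, the softmax abundance head, the class and variable names, and the latent-consistency stand-in for the transcription loss are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of a coupled autoencoder with
# cross-attention for linear hyperspectral unmixing, loosely following the
# ideas named in the abstract. Layer widths, the softmax abundance head and
# the latent-consistency stand-in for the "transcription loss" are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CoupledUnmixingAE(nn.Module):
    def __init__(self, n_bands: int, n_endmembers: int, d_model: int = 64):
        super().__init__()
        # Main encoder: mixed pixels -> latent features
        self.enc_main = nn.Sequential(nn.Linear(n_bands, d_model), nn.ReLU(),
                                      nn.Linear(d_model, d_model))
        # Auxiliary encoder (half-Siamese: same shape, separate weights),
        # intended for pseudo-pure pixels
        self.enc_aux = nn.Sequential(nn.Linear(n_bands, d_model), nn.ReLU(),
                                     nn.Linear(d_model, d_model))
        # Cross-attention: queries from the mixed-pixel stream,
        # keys/values from the pseudo-pure stream
        self.cross_attn = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)
        # Abundance head: softmax enforces non-negativity and sum-to-one
        self.to_abund = nn.Linear(d_model, n_endmembers)
        # Linear decoder: its weight matrix plays the role of the endmember matrix
        self.decoder = nn.Linear(n_endmembers, n_bands, bias=False)

    def forward(self, x_mixed, x_pure):
        # x_mixed: (batch, n_bands) mixed pixels; x_pure: (n_pure, n_bands)
        f_main = self.enc_main(x_mixed).unsqueeze(1)                       # (batch, 1, d)
        f_aux = self.enc_aux(x_pure).unsqueeze(0).expand(x_mixed.size(0), -1, -1)
        attn_out, _ = self.cross_attn(f_main, f_aux, f_aux)                # (batch, 1, d)
        fused = (f_main + attn_out).squeeze(1)
        abund = F.softmax(self.to_abund(fused), dim=-1)                    # abundances
        recon = self.decoder(abund)                                        # reconstructed spectra
        return recon, abund, fused


if __name__ == "__main__":
    # Toy usage with random data, only to show the training signal
    model = CoupledUnmixingAE(n_bands=200, n_endmembers=4)
    x_mixed = torch.rand(32, 200)   # mixed pixels
    x_pure = torch.rand(16, 200)    # pseudo-pure pixels (e.g. from a purity screen)
    recon, abund, fused = model(x_mixed, x_pure)
    loss = F.mse_loss(recon, x_mixed)  # pixel-level reconstruction term
    # Placeholder for the paper's transcription-loss idea (an assumption):
    # keep material-related information by matching the re-encoded
    # reconstruction to the original latent code instead of only pixels.
    loss = loss + 0.1 * F.mse_loss(model.enc_main(recon),
                                   model.enc_main(x_mixed).detach())
    loss.backward()
```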
Author Xu, Tingfa
Sun, Haixin
Meng, Fanlei
Li, Jie
Author_xml – Meng, Fanlei (ORCID: 0000-0002-1014-3019); organization: Changchun University
– Sun, Haixin (ORCID: 0000-0002-6339-6589); organization: Changchun University
– Li, Jie; organization: Changchun University
– Xu, Tingfa; email: ciom_xtf1@bit.edu.cn; organization: Beijing Institute of Technology
CitedBy_id crossref_primary_10_1109_JSTARS_2025_3593531
crossref_primary_10_3390_rs17050869
ContentType Journal Article
Copyright 2024 Informa UK Limited, trading as Taylor & Francis Group
DOI 10.1080/01431161.2024.2371084
DatabaseName CrossRef
Meteorological & Geoastrophysical Abstracts
Oceanic Abstracts
Technology Research Database
ASFA: Aquatic Sciences and Fisheries Abstracts
Engineering Research Database
Aerospace Database
Aquatic Science & Fisheries Abstracts (ASFA) 2: Ocean Technology, Policy & Non-Living Resources
Meteorological & Geoastrophysical Abstracts - Academic
Civil Engineering Abstracts
Aquatic Science & Fisheries Abstracts (ASFA) Professional
Advanced Technologies Database with Aerospace
AGRICOLA
AGRICOLA - Academic
Discipline Geography
EISSN 1366-5901
EndPage 5712
Genre Research Article
GrantInformation_xml – fundername: Education Department of Jilin Province
– fundername: National Natural Science Foundation of China
  grantid: 41730422
ISICitedReferencesCount 2
ISSN 0143-1161
1366-5901
IsPeerReviewed true
IsScholarly true
Issue 17
Language English
ORCID 0000-0002-6339-6589
0000-0002-1014-3019
PageCount 34
PublicationCentury 2000
PublicationDate 2024-09-01
PublicationDecade 2020
PublicationPlace London
PublicationTitle International journal of remote sensing
PublicationYear 2024
Publisher Taylor & Francis
Taylor & Francis Ltd
StartPage 5679
SubjectTerms Abundance
Artificial neural networks
Coders
Coupled transformer network
cross-attention
data collection
Effectiveness
Estimates
image analysis
Pixels
Reconstruction
robust hyperspectral unmixing
Robustness (mathematics)
spectral variability
transcription loss
Transformers
uncertainty
Title CTNet: an efficient coupled transformer network for robust hyperspectral unmixing
URI https://www.tandfonline.com/doi/abs/10.1080/01431161.2024.2371084
https://www.proquest.com/docview/3095378126
https://www.proquest.com/docview/3153766427
Volume 45