Unsupervised Deep Slow Feature Analysis for Change Detection in Multi-Temporal Remote Sensing Images
Saved in:
| Published in: | IEEE Transactions on Geoscience and Remote Sensing, Vol. 57, No. 12, pp. 9976 - 9992 |
|---|---|
| Main Authors: | Du, Bo; Ru, Lixiang; Wu, Chen; Zhang, Liangpei |
| Format: | Journal Article |
| Language: | English |
| Published: | New York: IEEE, 01.12.2019 (The Institute of Electrical and Electronics Engineers, Inc.) |
| Subjects: | |
| ISSN: | 0196-2892, 1558-0644 |
| Online Access: | Full text |
| Abstract | Change detection has long been a research hotspot in remote sensing. With the increasing availability of multi-temporal remote sensing images, numerous change detection algorithms have been proposed. Among these methods, image transformation methods based on feature extraction and mapping can effectively highlight changed information and thus achieve better change detection performance. However, the changes in multi-temporal images are usually complex, and existing methods are often not effective enough. In recent years, deep networks have shown excellent performance in many fields, including feature extraction and projection. Therefore, in this paper, based on deep networks and slow feature analysis (SFA) theory, we propose a new change detection algorithm for multi-temporal remote sensing images called deep SFA (DSFA). In the DSFA model, two symmetric deep networks are utilized to project the input data of the bi-temporal imagery. The SFA module is then deployed to suppress the unchanged components and highlight the changed components of the transformed features. Change vector analysis (CVA) pre-detection is employed to select unchanged pixels with high confidence as training samples. Finally, the change intensity is calculated with the chi-square distance, and changes are determined by threshold algorithms. Experiments are performed on two real-world data sets and a public hyperspectral data set. Visual comparison and quantitative evaluation show that DSFA outperforms other state-of-the-art algorithms, including other SFA-based and deep learning methods. |
|---|---|
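The pipeline described in the abstract can be illustrated end to end with a short sketch. The NumPy-only code below is a minimal illustration of a DSFA-style workflow, not the authors' implementation: the two symmetric deep networks are stood in for by a single shared random-weight tanh layer, the SFA step is solved as a generalized eigenvalue problem on the projected features, and the change intensity is a chi-square-style distance cut at an arbitrary percentile. The hidden-layer size, the shared-weight simplification, the simulated data, and the 90th-percentile threshold are all illustrative assumptions.

```python
"""Minimal sketch of a DSFA-style change-detection pipeline (illustrative, not the paper's code)."""
import numpy as np


def project(x, w, b):
    """One tanh hidden layer standing in for the deep projection network."""
    return np.tanh(x @ w + b)


def sfa_change_intensity(f1, f2):
    """Slow feature analysis on paired features: find projections in which
    unchanged pixels vary as little as possible across the two dates, then
    score each pixel with a chi-square-style distance in that space."""
    f1 = f1 - f1.mean(axis=0)
    f2 = f2 - f2.mean(axis=0)
    d = f1 - f2                                    # temporal difference
    n, k = f1.shape
    sigma_d = d.T @ d / n                          # difference covariance
    sigma_x = (f1.T @ f1 + f2.T @ f2) / (2 * n)    # average signal covariance
    # Generalized eigenproblem: minimize w' Sigma_d w / w' Sigma_x w.
    eigvals, eigvecs = np.linalg.eig(
        np.linalg.solve(sigma_x + 1e-6 * np.eye(k), sigma_d))
    w = eigvecs[:, np.argsort(eigvals.real)].real  # slowest components first
    z = d @ w                                      # projected difference
    z = z / (z.std(axis=0) + 1e-12)                # variance-normalize each component
    return np.sum(z ** 2, axis=1)                  # chi-square-style intensity per pixel


# --- toy usage --------------------------------------------------------------
rng = np.random.default_rng(0)
n_pixels, n_bands, n_hidden = 5000, 6, 32

# Simulated bi-temporal pixels: mostly unchanged, last 500 pixels changed.
x1 = rng.normal(size=(n_pixels, n_bands))
x2 = x1 + 0.05 * rng.normal(size=(n_pixels, n_bands))
x2[-500:] += rng.normal(loc=2.0, size=(500, n_bands))

# Two symmetric (here weight-shared) projections of the two acquisition dates.
w_h = rng.normal(scale=0.3, size=(n_bands, n_hidden))
b_h = np.zeros(n_hidden)
intensity = sfa_change_intensity(project(x1, w_h, b_h), project(x2, w_h, b_h))

# A fixed percentile stands in for a proper thresholding algorithm.
change_map = intensity > np.percentile(intensity, 90)
print("detected changed pixels:", change_map.sum())
```

In the paper's method, the projection weights are additionally trained against the SFA objective using unchanged pixels selected by CVA pre-detection, and the final map is obtained with a dedicated thresholding algorithm rather than a fixed percentile; the sketch keeps random projection weights and skips that training loop.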
| Author | Ru, Lixiang; Zhang, Liangpei; Du, Bo; Wu, Chen |
| Author_xml | – sequence: 1 givenname: Bo orcidid: 0000-0002-0059-8458 surname: Du fullname: Du, Bo email: gunspace@163.com organization: School of Computer Science, Wuhan University, Wuhan, China – sequence: 2 givenname: Lixiang orcidid: 0000-0002-9129-2453 surname: Ru fullname: Ru, Lixiang email: rulixiang@whu.edu.cn organization: School of Computer Science, Wuhan University, Wuhan, China – sequence: 3 givenname: Chen orcidid: 0000-0001-6461-8377 surname: Wu fullname: Wu, Chen email: chen.wu@whu.edu.cn organization: School of Computer Science, Wuhan University, Wuhan, China – sequence: 4 givenname: Liangpei orcidid: 0000-0001-6890-3650 surname: Zhang fullname: Zhang, Liangpei email: zlp62@whu.edu.cn organization: School of Computer Science, Wuhan University, Wuhan, China |
| CODEN | IGRSD2 |
| ContentType | Journal Article |
| Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2019 |
| Copyright_xml | – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2019 |
| DOI | 10.1109/TGRS.2019.2930682 |
| Discipline | Engineering Physics |
| EISSN | 1558-0644 |
| EndPage | 9992 |
| ExternalDocumentID | 10_1109_TGRS_2019_2930682 8824216 |
| Genre | orig-research |
| GrantInformation_xml | – fundername: National Basic Research Program of China (973 Program); National Key Research and Development Program grantid: 2017YFC1502505 funderid: 10.13039/501100012166 – fundername: National Key R&D Program of China grantid: 2018YFA0605500 – fundername: Natural Science Foundation of Hubei Province grantid: 2018CFA050 funderid: 10.13039/501100003819 – fundername: National Natural Science Foundation of China grantid: 61601333; 61822113; 41801285; 41871243 funderid: 10.13039/501100001809 |
| ISICitedReferencesCount | 279 |
| ISSN | 0196-2892 |
| IsPeerReviewed | true |
| IsScholarly | true |
| Issue | 12 |
| Language | English |
| License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
| ORCID | 0000-0001-6461-8377 0000-0001-6890-3650 0000-0002-9129-2453 0000-0002-0059-8458 |
| PQID | 2317727391 |
| PQPubID | 85465 |
| PageCount | 17 |
| PublicationCentury | 2000 |
| PublicationDate | 2019-12-01 |
| PublicationDateYYYYMMDD | 2019-12-01 |
| PublicationDate_xml | – month: 12 year: 2019 text: 2019-12-01 day: 01 |
| PublicationDecade | 2010 |
| PublicationPlace | New York |
| PublicationPlace_xml | – name: New York |
| PublicationTitle | IEEE transactions on geoscience and remote sensing |
| PublicationTitleAbbrev | TGRS |
| PublicationYear | 2019 |
| Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| Publisher_xml | – name: IEEE – name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| SSID | ssj0014517 |
| SourceID | proquest crossref ieee |
| SourceType | Aggregation Database Enrichment Source Index Database Publisher |
| StartPage | 9976 |
| SubjectTerms | Algorithms; Analysis; Artificial neural networks; Change detection; Change detection algorithms; Components; Datasets; Deep learning; deep network; Detection; Detection algorithms; Eigenvalues and eigenfunctions; Feature extraction; Image detection; Imagery; Machine learning; Mapping; Methods; Remote sensing; remote sensing images; slow feature analysis (SFA); Training; Vector analysis |
| Title | Unsupervised Deep Slow Feature Analysis for Change Detection in Multi-Temporal Remote Sensing Images |
| URI | https://ieeexplore.ieee.org/document/8824216 https://www.proquest.com/docview/2317727391 |
| Volume | 57 |