Wider or Deeper: Revisiting the ResNet Model for Visual Recognition
Saved in:
| Published in: | Pattern Recognition, Vol. 90, pp. 119–133 |
|---|---|
| Main authors: | Wu, Zifeng; Shen, Chunhua; van den Hengel, Anton |
| Format: | Journal Article |
| Language: | English |
| Published: | Elsevier Ltd, June 2019 |
| Keywords: | Semantic segmentation; Residual network; Image classification |
| ISSN: | 0031-3203, 1873-5142 |
| DOI: | 10.1016/j.patcog.2019.01.006 |
| Cited by: | 1059 articles (Web of Science) |
| Online access: | Full text |
| Abstract | •We further develop the unravelled view of ResNets, which helps us better understand their behaviours. We demonstrate this in the context of a training process, which is the key difference from the original version. •We propose a group of relatively shallow convolutional networks based on our new understanding. Some of them perform comparably to the state-of-the-art approaches on the ImageNet classification dataset. •We evaluate the impact of using different networks on the performance of semantic image segmentation, and show that these networks, as pre-trained features, can substantially improve existing algorithms.
The community has been going deeper and deeper in designing one cutting-edge network after another, yet some works suggest that we may have gone too far in this dimension. Some researchers unravelled a residual network into an exponentially wider one, and attributed the success of residual networks to the fusion of a large number of relatively shallow models. Since some of their early claims are still not settled, in this paper we dig further into this topic, i.e., the unravelled view of residual networks. Based on that, we try to find a good compromise between depth and width. Afterwards, we walk through a typical pipeline for developing a deep-learning-based algorithm. We start from a group of relatively shallow networks, which perform as well as or even better than the current (much deeper) state-of-the-art models on the ImageNet classification dataset. Then, we initialize fully convolutional networks (FCNs) with our pre-trained models and tune them for semantic image segmentation. Results show that the proposed networks, as pre-trained features, can substantially improve existing methods. Even without exhaustively applying the sophisticated techniques that improve the classic FCN model, we achieve results comparable to the best performers on four widely used datasets, i.e., Cityscapes, PASCAL VOC, ADE20K and PASCAL-Context. The code and pre-trained models are released for public access: https://github.com/itijyou/ademxapp |
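The "unravelled view" mentioned above can be made concrete with a short numerical check. The sketch below is our illustration, not the authors' code: it assumes linear residual branches (so the expansion is exact) and unrolls a stack of n residual blocks, y_{i+1} = y_i + F_i(y_i), into its 2^n paths, one per subset of blocks applied in order.

```python
# Illustrative sketch of the "unravelled view" of a residual network
# (our example, not the paper's code). With linear residual branches the
# expansion is exact: n stacked blocks y_{i+1} = y_i + F_i(y_i) equal the
# sum over all 2^n subsets of blocks, i.e. an implicit ensemble of
# relatively shallow paths.
from itertools import combinations

import numpy as np

rng = np.random.default_rng(0)
n, d = 3, 4                                   # 3 residual blocks, width 4
W = [0.1 * rng.standard_normal((d, d)) for _ in range(n)]
F = [lambda y, Wi=Wi: Wi @ y for Wi in W]     # residual branches (assumed linear)

x = rng.standard_normal(d)

# Stacked (sequential) evaluation.
y = x
for f in F:
    y = y + f(y)

# Unravelled evaluation: one term per subset of blocks, i.e. per path.
y_unravelled = np.zeros(d)
for k in range(n + 1):
    for path in combinations(range(n), k):    # the empty path is the identity
        v = x
        for i in path:
            v = F[i](v)
        y_unravelled += v

assert np.allclose(y, y_unravelled)           # 2**n = 8 paths reproduce the stack
```

Removing one block deletes half of the paths but leaves the other half untouched, which is the intuition behind reading a residual network as a fusion of many relatively shallow models.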
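The abstract also summarises a pipeline: pre-train a network for ImageNet classification, then initialise a fully convolutional network (FCN) from those weights and fine-tune it for semantic segmentation. The sketch below is a minimal illustration of that conversion, assuming PyTorch/torchvision and a stock ResNet-50 backbone; it is not the authors' MXNet code (linked above), and their actual segmentation networks differ in detail.

```python
# Minimal sketch (assumed PyTorch/torchvision, not the authors' MXNet code)
# of the pipeline in the abstract: reuse an ImageNet-pre-trained network
# as the body of a fully convolutional network for segmentation.
import torch
import torch.nn as nn
from torchvision.models import resnet50   # stand-in for the paper's networks

NUM_CLASSES = 19                           # e.g. Cityscapes

backbone = resnet50(weights="IMAGENET1K_V1")   # downloads pre-trained weights
features = nn.Sequential(*list(backbone.children())[:-2])  # drop avgpool + fc

class SimpleFCN(nn.Module):
    def __init__(self, features, num_classes):
        super().__init__()
        self.features = features
        self.classifier = nn.Conv2d(2048, num_classes, kernel_size=1)  # 1x1 scores

    def forward(self, x):
        h, w = x.shape[-2:]
        logits = self.classifier(self.features(x))   # coarse per-class score map
        # Upsample the scores back to input resolution, as in the classic FCN.
        return nn.functional.interpolate(
            logits, size=(h, w), mode="bilinear", align_corners=False)

model = SimpleFCN(features, NUM_CLASSES)
out = model(torch.randn(1, 3, 512, 512))   # -> torch.Size([1, 19, 512, 512])
```

The whole model would then be fine-tuned end to end on the segmentation dataset, which is how the pre-trained features "boost" the existing FCN approach.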
Intell. doi: 10.1109/TPAMI.2017.2723009 – volume: 39 start-page: 640 issue: 4 year: 2017 ident: 10.1016/j.patcog.2019.01.006_bib0016 article-title: Fully convolutional networks for semantic segmentation publication-title: IEEE Trans. Pattern Anal. Mach. Intell. doi: 10.1109/TPAMI.2016.2572683 – volume: 39 start-page: 1137 issue: 6 year: 2015 ident: 10.1016/j.patcog.2019.01.006_bib0021 article-title: Faster R-CNN: Towards real-time object detection with region proposal networks publication-title: IEEE Trans. Pattern Anal. Mach. Intell. doi: 10.1109/TPAMI.2016.2577031 – start-page: 740 year: 2014 ident: 10.1016/j.patcog.2019.01.006_bib0020 article-title: Microsoft COCO: Common objects in context – start-page: 991 year: 2011 ident: 10.1016/j.patcog.2019.01.006_bib0030 article-title: Semantic contours from inverse detectors – ident: 10.1016/j.patcog.2019.01.006_bib0026 – start-page: 519 year: 2016 ident: 10.1016/j.patcog.2019.01.006_bib0040 article-title: Laplacian pyramid reconstruction and refinement for semantic segmentation |
| StartPage | 119 |
| SubjectTerms | Image classification; Residual network; Semantic segmentation |
| Title | Wider or Deeper: Revisiting the ResNet Model for Visual Recognition |
| URI | https://dx.doi.org/10.1016/j.patcog.2019.01.006 |
| Volume | 90 |