Attribute-Based Classification for Zero-Shot Visual Object Categorization
| Published in: | IEEE Transactions on Pattern Analysis and Machine Intelligence, Volume 36, Issue 3, pp. 453-465 |
|---|---|
| Main authors: | Lampert, Christoph H.; Nickisch, Hannes; Harmeling, Stefan |
| Format: | Journal Article |
| Language: | English |
| Publication details: | Los Alamitos, CA: IEEE, 01.03.2014. IEEE Computer Society; The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| Subject: | |
| ISSN: | 0162-8828, 1939-3539, 2160-9292 |
| Online access: | Get full text |
| Abstract | We study the problem of object recognition for categories for which we have no training examples, a task also called zero-data or zero-shot learning. This situation has hardly been studied in computer vision research, even though it occurs frequently; the world contains tens of thousands of different object classes, and image collections have been formed and suitably annotated for only a few of them. To tackle the problem, we introduce attribute-based classification: Objects are identified based on a high-level description that is phrased in terms of semantic attributes, such as the object's color or shape. Because the identification of each such property transcends the specific learning task at hand, the attribute classifiers can be prelearned independently, for example, from existing image data sets unrelated to the current task. Afterward, new classes can be detected based on their attribute representation, without the need for a new training phase. In this paper, we also introduce a new data set, Animals with Attributes, of over 30,000 images of 50 animal classes, annotated with 85 semantic attributes. Extensive experiments on this and two more data sets show that attribute-based classification indeed is able to categorize images without access to any training images of the target classes. |
|---|---|
| Author | Lampert, Christoph H. (chl@ist.ac.at), Institute of Science and Technology Austria, Klosterneuburg, Austria; Nickisch, Hannes (hannes@nickisch.org), Philips Research, Hamburg, Germany; Harmeling, Stefan (stefan.harmeling@tuebingen.mpg.de), Max Planck Institute for Intelligent Systems, Tübingen, Germany |
| CODEN | ITPIDJ |
| ContentType | Journal Article |
| Copyright | 2015 INIST-CNRS; Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE), Mar 2014 |
| DOI | 10.1109/TPAMI.2013.140 |
| Discipline | Engineering; Computer Science; Applied Sciences |
| EISSN | 2160-9292, 1939-3539 |
| EndPage | 465 |
| Genre | orig-research; Research Support, Non-U.S. Gov't; Journal Article |
| ISICitedReferencesCount | 1261 |
| ISSN | 0162-8828, 1939-3539 |
| IsPeerReviewed | true |
| IsScholarly | true |
| Issue | 3 |
| Keywords | Computer vision; Zero-shot learning; Image interpretation; Image databank; Pattern recognition; Object recognition; Annotation; Semantics; Scene analysis; Animal; vision and scene understanding; Object oriented; Collection; Categorization; Image classification |
| Language | English |
| License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html CC BY 4.0 |
| PMID | 24457503 |
| PageCount | 13 |
| PublicationDate | 2014-03-01 |
| PublicationPlace | Los Alamitos, CA |
| PublicationTitle | IEEE transactions on pattern analysis and machine intelligence |
| PublicationTitleAbbrev | TPAMI |
| PublicationTitleAlternate | IEEE Trans Pattern Anal Mach Intell |
| PublicationYear | 2014 |
| Publisher | IEEE; IEEE Computer Society; The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| StartPage | 453 |
| SubjectTerms | Animals Applied sciences Artificial intelligence Classification Classification - methods Computer science; control theory; systems Computer vision Databases, Factual Exact sciences and technology Image Processing, Computer-Assisted - methods Learning Marine animals Models, Statistical Object recognition Pattern Recognition, Automated - methods Pattern recognition. Digital image processing. Computational geometry Probabilistic logic Programming languages Semantics Software Support Vector Machine Tasks Training Vectors vision and scene understanding |
| Title | Attribute-Based Classification for Zero-Shot Visual Object Categorization |
| URI | https://ieeexplore.ieee.org/document/6571196 https://www.ncbi.nlm.nih.gov/pubmed/24457503 https://www.proquest.com/docview/1504635202 https://www.proquest.com/docview/1492705297 https://www.proquest.com/docview/1520950795 |
| Volume | 36 |
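The abstract above describes attribute-based zero-shot classification only in prose: per-attribute classifiers are pre-learned on classes that do have training images, and previously unseen classes are then recognized purely from their known attribute signatures. The sketch below illustrates that idea in Python; the toy class/attribute matrices, feature dimensions, scikit-learn classifier choice, and the product-of-probabilities scoring rule are assumptions made for illustration, not the authors' released code or the exact model evaluated in the paper.

```python
# Illustrative attribute-based zero-shot classifier (hedged sketch, not the
# authors' implementation). Per-attribute classifiers are trained on "seen"
# classes only; "unseen" classes are recognized purely from their attribute
# signatures, with no training images of those classes.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy class/attribute matrices (the real Animals with Attributes data set
# uses 50 classes and 85 attributes such as color, shape, or habitat).
seen_class_attributes = np.array([[1, 0, 1, 0, 1],
                                  [0, 1, 1, 0, 0],
                                  [1, 1, 0, 1, 0]])    # 3 training classes
unseen_class_attributes = np.array([[0, 0, 1, 1, 1],
                                    [1, 0, 0, 1, 0]])  # 2 zero-shot classes
n_attributes = seen_class_attributes.shape[1]

# Fake image features and labels for the seen classes (stand-ins for real
# visual descriptors of training images).
X_train = rng.normal(size=(300, 20))
y_train = rng.integers(0, 3, size=300)
A_train = seen_class_attributes[y_train]   # per-image attribute labels

# Step 1: pre-learn one classifier per attribute, independent of the target task.
attribute_models = [
    LogisticRegression(max_iter=1000).fit(X_train, A_train[:, a])
    for a in range(n_attributes)
]

def predict_unseen_class(x):
    """Score each unseen class by how well its attribute signature matches the
    attribute probabilities predicted for image x (naive product rule)."""
    p_attr = np.array([m.predict_proba(x.reshape(1, -1))[0, 1]
                       for m in attribute_models])
    scores = np.prod(np.where(unseen_class_attributes == 1, p_attr, 1 - p_attr),
                     axis=1)
    return int(np.argmax(scores))

print(predict_unseen_class(rng.normal(size=20)))   # index of the best unseen class
```

In the setting the abstract describes, the attribute signatures for the zero-shot classes would come from the 85 semantic attribute annotations of the Animals with Attributes classes rather than from hand-written toy vectors.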