KITTI-360: A Novel Dataset and Benchmarks for Urban Scene Understanding in 2D and 3D
| Published in: | IEEE Transactions on Pattern Analysis and Machine Intelligence, Volume 45, Issue 3, pp. 3292-3310 |
|---|---|
| Main authors: | Liao, Yiyi; Xie, Jun; Geiger, Andreas |
| Medium: | Journal Article |
| Language: | English |
| Publication details: | United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.03.2023 |
| ISSN: | 0162-8828, 1939-3539, 2160-9292 |
| Online access: | Get full text |
| Abstract | For the last few decades, several major subfields of artificial intelligence including computer vision, graphics, and robotics have progressed largely independently from each other. Recently, however, the community has realized that progress towards robust intelligent systems such as self-driving cars requires a concerted effort across the different fields. This motivated us to develop KITTI-360, successor of the popular KITTI dataset. KITTI-360 is a suburban driving dataset which comprises richer input modalities, comprehensive semantic instance annotations and accurate localization to facilitate research at the intersection of vision, graphics and robotics. For efficient annotation, we created a tool to label 3D scenes with bounding primitives and developed a model that transfers this information into the 2D image domain, resulting in over 150k images and 1B 3D points with coherent semantic instance annotations across 2D and 3D. Moreover, we established benchmarks and baselines for several tasks relevant to mobile perception, encompassing problems from computer vision, graphics, and robotics on the same dataset, e.g., semantic scene understanding, novel view synthesis and semantic SLAM. KITTI-360 will enable progress at the intersection of these research areas and thus contribute towards solving one of today's grand challenges: the development of fully autonomous self-driving systems. |
|---|---|
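The abstract mentions a model that transfers 3D bounding-primitive annotations into the 2D image domain. As an illustrative sketch only, not the paper's actual transfer model, semantic labels attached to 3D points can be carried into an image by pinhole projection with a z-buffer-style nearest-point rule; the function name and parameters below are hypothetical:

```python
import numpy as np

def project_labels(points_cam, labels, K, h, w):
    """Project labeled 3D points (camera frame) into an (h, w) label image.

    points_cam: (N, 3) points in camera coordinates (z pointing forward).
    labels:     (N,) integer semantic labels (values < 255).
    K:          (3, 3) pinhole intrinsic matrix.
    Pixels hit by no point keep the 'unlabeled' value 255.
    """
    z = points_cam[:, 2]
    keep = z > 0                        # only points in front of the camera
    pts, lab = points_cam[keep], labels[keep]

    uv = (K @ pts.T).T                  # homogeneous pixel coordinates
    uv = uv[:, :2] / uv[:, 2:3]         # perspective divide
    u = np.floor(uv[:, 0]).astype(int)
    v = np.floor(uv[:, 1]).astype(int)

    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    u, v, lab = u[inside], v[inside], lab[inside]
    depth = pts[inside, 2]

    img = np.full((h, w), 255, dtype=np.uint8)
    for i in np.argsort(-depth):        # farthest first, so the nearest point
        img[v[i], u[i]] = lab[i]        # wins at each pixel (z-buffer rule)
    return img
```

In the real dataset the 3D annotations are bounding primitives rather than raw labeled points, and the published transfer model handles occlusion and label coherence far more carefully; this sketch only illustrates the geometry of moving labels from 3D to 2D.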
| Author | Liao, Yiyi; Xie, Jun; Geiger, Andreas |
| Author details | – Yiyi Liao (ORCID 0000-0001-6662-3022), yiyi.liao@tue.mpg.de, Autonomous Vision Group, University of Tübingen and Max Planck Institute for Intelligent Systems, Tübingen, Germany – Jun Xie, junx@google.com, Google Research, Mountain View, CA, USA – Andreas Geiger, a.geiger@uni-tuebingen.de, Autonomous Vision Group, University of Tübingen and Max Planck Institute for Intelligent Systems, Tübingen, Germany |
| CODEN | ITPIDJ |
| ContentType | Journal Article |
| Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2023 |
| DOI | 10.1109/TPAMI.2022.3179507 |
| Discipline | Engineering Computer Science |
| EISSN | 2160-9292 1939-3539 |
| EndPage | 3310 |
| ExternalDocumentID | PubMed: 35648872; Crossref: 10.1109/TPAMI.2022.3179507; IEEE: 9786676 |
| Genre | orig-research Journal Article |
| GrantInformation | ERC grant LEGO-3D 850533; German Federal Ministry of Education and Research grant 01IS18039A; DFG EXC grant 390727645 |
| ISICitedReferencesCount | 333 |
| ISSN | 0162-8828 1939-3539 |
| IsPeerReviewed | true |
| IsScholarly | true |
| Issue | 3 |
| Language | English |
| License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
| ORCID | 0000-0001-6662-3022 |
| PMID | 35648872 |
| PageCount | 19 |
| PublicationDate | 2023-03-01 |
| PublicationPlace | United States |
| PublicationTitle | IEEE transactions on pattern analysis and machine intelligence |
| PublicationTitleAbbrev | TPAMI |
| PublicationTitleAlternate | IEEE Trans Pattern Anal Mach Intell |
| PublicationYear | 2023 |
| Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| StartPage | 3292 |
| SubjectTerms | Annotations; Artificial intelligence; Autonomous cars; Benchmark testing; Benchmarks; Cameras; Computer vision; Datasets; Performance evaluation; Point cloud labeling; Robotics; Scene analysis; Scene understanding; Self-driving; Semantic label transfer; Semantics; Task analysis; Three-dimensional displays |
| Title | KITTI-360: A Novel Dataset and Benchmarks for Urban Scene Understanding in 2D and 3D |
| URI | https://ieeexplore.ieee.org/document/9786676 https://www.ncbi.nlm.nih.gov/pubmed/35648872 https://www.proquest.com/docview/2773455344 https://www.proquest.com/docview/2672706082 |
| Volume | 45 |