Camera Constraint-Free View-Based 3-D Object Retrieval

Detailed Bibliography
Published in: IEEE Transactions on Image Processing, Volume 21, Issue 4, pp. 2269-2281
Main Authors: Yue Gao, Jinhui Tang, Richang Hong, Shuicheng Yan, Qionghai Dai, Naiyao Zhang, Tat-Seng Chua
Format: Journal Article
Language: English
Publication Details: IEEE, United States, 1 April 2012
Subjects: 3-D object; camera constraint-free; retrieval; view-based
ISSN: 1057-7149; EISSN: 1941-0042
Online Access: Get full text
Abstract: Recently, extensive research efforts have been dedicated to view-based methods for 3-D object retrieval due to the highly discriminative property of multiviews for 3-D object representation. However, most state-of-the-art approaches depend heavily on their own camera array settings for capturing views of 3-D objects. In order to move toward a general framework for 3-D object retrieval without the restriction of a fixed camera array, a camera constraint-free view-based (CCFV) 3-D object retrieval algorithm is proposed in this paper. In this framework, each object is represented by a free set of views, which means that these views can be captured from any direction without camera constraint. For each query object, we first cluster all query views to generate the view clusters, which are then used to build the query models. For a more accurate 3-D object comparison, a positive matching model and a negative matching model are trained individually using positive and negative matched samples, respectively. The CCFV model is generated on the basis of the query Gaussian models by combining the positive matching model and the negative matching model. The CCFV removes the constraint of static camera array settings for view capturing and can be applied to any view-based 3-D object database. We conduct experiments on the National Taiwan University 3-D model database and the ETH 3-D object database. Experimental results show that the proposed scheme achieves better performance than state-of-the-art methods.
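The abstract sketches the CCFV query pipeline: cluster the freely captured query views, fit a Gaussian model per view cluster, and score candidate objects by combining a positive and a negative matching model. The snippet below is a minimal illustrative sketch of that flow in Python, not the authors' implementation: the feature representation, the use of k-means for view clustering, the diagonal-covariance Gaussians, and the log-likelihood-ratio style combination (the names build_query_models and ccfv_score) are all assumptions made for illustration.

```python
# Illustrative sketch only: plain feature vectors per view, k-means clustering,
# diagonal Gaussians per cluster, and a log-likelihood-ratio combination of the
# positive/negative matching models are assumptions, not the paper's exact method.
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.cluster import KMeans


def build_query_models(query_views, n_clusters=4):
    """Cluster unconstrained query views (rows = feature vectors) and fit one
    Gaussian per view cluster, mirroring the 'query Gaussian models' step."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(query_views)
    models = []
    for c in range(n_clusters):
        members = query_views[km.labels_ == c]
        mean = members.mean(axis=0)
        # Variance floor keeps the covariance positive definite even for tiny clusters.
        cov = np.diag(members.var(axis=0) + 1e-3)
        models.append(multivariate_normal(mean=mean, cov=cov))
    return models


def ccfv_score(query_models, candidate_views, pos_model, neg_model):
    """Score a candidate object's view set against the query models, rewarding
    agreement with the positive matching model and penalizing the negative one."""
    per_view = []
    for v in candidate_views:
        best_cluster = max(m.logpdf(v) for m in query_models)  # best-matching query cluster
        per_view.append(best_cluster + pos_model.logpdf(v) - neg_model.logpdf(v))
    return float(np.mean(per_view))


# Toy usage with random 64-D features: 12 free query views, 20 candidate views,
# and stand-in positive/negative matching models (here just fixed Gaussians).
rng = np.random.default_rng(0)
query_models = build_query_models(rng.normal(size=(12, 64)))
pos = multivariate_normal(mean=np.zeros(64), cov=np.eye(64))
neg = multivariate_normal(mean=np.ones(64), cov=np.eye(64))
print(ccfv_score(query_models, rng.normal(size=(20, 64)), pos, neg))
```

In the paper the positive and negative matching models are trained from positively and negatively matched samples; here they are fixed stand-in Gaussians only so that the sketch stays self-contained.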
Authors and Affiliations:
– Yue Gao (gaoyue08@mails.tsinghua.edu.cn), Department of Automation, Tsinghua University, Beijing, China
– Jinhui Tang (jinhui-tang@mail.njust.edu.cn), School of Computer Science & Technology, Nanjing University of Science & Technology, Nanjing, China
– Richang Hong (hongrc.hfut@gmail.com), School of Computer & Information Science, Hefei University of Technology, Hefei, China
– Shuicheng Yan (eleyans@nus.edu.sg), Department of Electrical & Computer Engineering, National University of Singapore, Singapore
– Qionghai Dai (qhdai@tsinghua.edu.cn), Department of Automation, Tsinghua University, Beijing, China
– Naiyao Zhang (zlh@tsinghua.edu.cn), Department of Automation, Tsinghua University, Beijing, China
– Tat-Seng Chua (chuats@comp.nus.edu.sg), School of Computing, National University of Singapore, Singapore
DOI: 10.1109/TIP.2011.2170081
Open Access Link: http://scholarbank.nus.edu.sg/handle/10635/43076
PMID: 21965212
Subject Terms:
3-D object
Algorithms
Arrays
Camera constraint-free
Cameras
Computational modeling
Educational institutions
Image Enhancement - methods
Image Interpretation, Computer-Assisted - methods
Imaging, Three-Dimensional - methods
Information Storage and Retrieval - methods
Pattern Recognition, Automated - methods
Photography - methods
Reproducibility of Results
retrieval
Sensitivity and Specificity
Signal Processing, Computer-Assisted
Solid modeling
Subtraction Technique
Three dimensional displays
view-based
URIs:
https://ieeexplore.ieee.org/document/6030936
https://www.ncbi.nlm.nih.gov/pubmed/21965212
https://www.proquest.com/docview/940837392