Source-Free Progressive Graph Learning for Open-Set Domain Adaptation
| Published in: | IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 45, No. 9, pp. 11240-11255 |
|---|---|
| Main authors: | Luo, Yadan; Wang, Zijian; Chen, Zhuoxiao; Huang, Zi; Baktashmotlagh, Mahsa |
| Medium: | Journal Article |
| Language: | English |
| Published: | United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.09.2023 |
| ISSN: | 0162-8828, 1939-3539, 2160-9292 |
| Online access: | Get full text |
| Abstract | Open-set domain adaptation (OSDA) aims to transfer knowledge from a label-rich source domain to a label-scarce target domain while addressing disturbances from irrelevant target classes not present in the source data. However, most OSDA approaches are limited by the lack of essential theoretical analysis of the generalization bound, reliance on the coexistence of source and target data during adaptation, and failure to accurately estimate the uncertainty of model predictions. To address these limitations, the Progressive Graph Learning (PGL) framework is proposed. PGL decomposes the target hypothesis space into shared and unknown subspaces and progressively pseudo-labels the most confident known samples from the target domain for hypothesis adaptation. PGL guarantees a tight upper bound on the target error by integrating a graph neural network with episodic training and leveraging adversarial learning to close the gap between the source and target distributions. The proposed approach also tackles a more realistic source-free open-set domain adaptation (SF-OSDA) setting that makes no assumptions about the coexistence of source and target domains. In a two-stage framework, the SF-PGL model uniformly selects the most confident target instances from each category at a fixed ratio, and the per-class confidence thresholds weight the classification loss in the adaptation step. The proposed methods are evaluated on benchmark image classification and action recognition datasets, where they demonstrate superiority and flexibility in recognizing both shared and unknown categories. Additionally, balanced pseudo-labeling plays a significant role in improving calibration, making the trained model less prone to over- or under-confident predictions on the target data. |
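The class-balanced pseudo-labeling step the abstract describes (selecting the most confident target instances from each predicted class at a fixed ratio) can be sketched as follows. This is a minimal NumPy illustration of the general technique, not the authors' implementation; the function name and interface are assumed for the example.

```python
import numpy as np


def balanced_pseudo_label(probs, ratio):
    """Class-balanced selection of confident pseudo-labels (illustrative sketch).

    probs: (N, C) softmax outputs of a source-trained model on unlabeled target samples.
    ratio: fraction of samples to pseudo-label within each predicted class.
    Returns the indices of the selected samples and their pseudo-labels.
    """
    preds = probs.argmax(axis=1)  # predicted class for each target sample
    conf = probs.max(axis=1)      # confidence (max softmax score) per sample
    selected, labels = [], []
    for c in range(probs.shape[1]):
        idx = np.where(preds == c)[0]        # samples predicted as class c
        if idx.size == 0:
            continue
        k = max(1, int(ratio * idx.size))    # same ratio for every class
        top = idx[np.argsort(-conf[idx])[:k]]  # k most confident in class c
        selected.extend(top.tolist())
        labels.extend([c] * len(top))
    return np.array(selected), np.array(labels)
```

Because the same ratio is applied to every class, no single class dominates the pseudo-label pool, which is the property the abstract credits with improving calibration.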
| Author | Luo, Yadan; Chen, Zhuoxiao; Huang, Zi; Wang, Zijian; Baktashmotlagh, Mahsa |
| Author details | 1. Yadan Luo (ORCID 0000-0001-6272-2971; y.luo@uq.edu.au); 2. Zijian Wang (ORCID 0000-0002-7190-9620; zijian.wang@uq.edu.au); 3. Zhuoxiao Chen (ORCID 0000-0001-5247-0109; zhuoxiao.chen@uq.edu.au); 4. Zi Huang (ORCID 0000-0002-9738-4949; helen.huang@uq.edu.au); 5. Mahsa Baktashmotlagh (ORCID 0000-0001-5255-8194; m.baktashmotlagh@uq.edu.au). All: School of Information Technology and Electrical Engineering, The University of Queensland, St Lucia, QLD, Australia |
| PubMed | https://www.ncbi.nlm.nih.gov/pubmed/37097801 |
| CODEN | ITPIDJ |
| ContentType | Journal Article |
| Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2023 |
| DOI | 10.1109/TPAMI.2023.3270288 |
| Discipline | Engineering Computer Science |
| EISSN | 2160-9292 1939-3539 |
| EndPage | 11255 |
| Genre | orig-research Journal Article |
| GrantInformation | Australian Research Council, grant CE200100025 (funder ID 10.13039/501100000923) |
| ISSN | 0162-8828 1939-3539 |
| IsPeerReviewed | true |
| IsScholarly | true |
| Issue | 9 |
| Language | English |
| License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
| ORCID | 0000-0001-6272-2971 0000-0001-5247-0109 0000-0002-7190-9620 0000-0001-5255-8194 0000-0002-9738-4949 |
| PMID | 37097801 |
| PageCount | 16 |
| PublicationDate | 2023-09-01 |
| PublicationPlace | United States |
| PublicationTitle | IEEE transactions on pattern analysis and machine intelligence |
| PublicationTitleAbbrev | TPAMI |
| PublicationTitleAlternate | IEEE Trans Pattern Anal Mach Intell |
| PublicationYear | 2023 |
| Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| StartPage | 11240 |
| SubjectTerms | Action recognition Adaptation Adaptation models domain adaptation Graph neural networks Hypotheses Image classification Knowledge management Labels Machine learning open-set domain adaptation Predictive models Semantics source-free domain adaptation Subspaces Task analysis Training Uncertainty Upper bounds |
| Title | Source-Free Progressive Graph Learning for Open-Set Domain Adaptation |
| URI | https://ieeexplore.ieee.org/document/10107906 https://www.ncbi.nlm.nih.gov/pubmed/37097801 https://www.proquest.com/docview/2847957655 https://www.proquest.com/docview/2806070973 |
| Volume | 45 |