Topological Data Analysis in Graph Neural Networks: Surveys and Perspectives
| Published in: | IEEE Transactions on Neural Networks and Learning Systems, Volume 36, Issue 6, pp. 9758 - 9776 |
|---|---|
| Main authors: | Phu Pham, Quang-Thinh Bui, Ngoc Thanh Nguyen, Robert Kozma, Philip S. Yu, Bay Vo |
| Medium: | Journal Article |
| Language: | English |
| Publication details: | United States: IEEE, 01.06.2025 |
| Subject: | Topological data analysis (TDA); graph neural networks (GNNs); persistent homology (PH); graph filtration; topological representation learning |
| ISSN: | 2162-237X (print), 2162-2388 (electronic) |
| Abstract | For many years, topological data analysis (TDA) and deep learning (DL) have been treated as separate approaches to data analysis and representation learning, with little in common. The root cause of this separation lies in the difficulty of building, extracting, and integrating TDA constructs, such as barcodes or persistence diagrams, within deep neural network architectures. As a result, the strengths of the two approaches have remained isolated and have not yet been combined into more powerful tools for complex data analysis tasks. In recent years, however, we have witnessed several remarkable attempts to integrate DL-based architectures with topological learning paradigms. These topology-driven DL techniques have notably improved data-driven analysis and mining, especially on graph datasets. Recently, graph neural networks (GNNs) have emerged as a popular deep neural architecture, demonstrating strong performance on a variety of graph-based analysis and learning problems. Within the manifold paradigm, a graph is naturally regarded as a topological object (e.g., its topological properties can be captured through its edge weights), so integrating TDA with GNNs is a natural combination. Many recent studies have demonstrated the effectiveness of TDA-assisted GNN architectures on complex graph-based representation, analysis, and learning problems. Motivated by these successes, this article presents a systematic literature review of this nascent and promising research direction, covering a general taxonomy, preliminaries, recently proposed state-of-the-art topology-driven GNN models, and perspectives. |
|---|---|
| Author | Phu Pham (Faculty of Information Technology, HUTECH University, Ho Chi Minh City, Vietnam; pta.phu@hutech.edu.vn; ORCID 0000-0002-8599-8126); Quang-Thinh Bui (Faculty of Education and Basic Sciences, Tien Giang University, My Tho City, Vietnam; vd.bay@hutech.edu.vn); Ngoc Thanh Nguyen (Department of Applied Informatics, Wrocław University of Science and Technology, Wrocław, Poland; ngoc-thanh.nguyen@pwr.edu.pl; ORCID 0000-0002-3247-2948); Robert Kozma (Department of Mathematics, University of Memphis, Memphis, TN, USA; rkozma@memphis.edu; ORCID 0000-0001-7011-5768); Philip S. Yu (Department of Computer Science, University of Illinois Chicago, Chicago, IL, USA; psyu@uic.edu; ORCID 0000-0002-3491-5968); Bay Vo (Faculty of Information Technology, HUTECH University, Ho Chi Minh City, Vietnam; vd.bay@hutech.edu.vn; ORCID 0000-0002-9246-4587) |
| BackLink | https://www.ncbi.nlm.nih.gov/pubmed/40030848 (view this record in MEDLINE/PubMed) |
| CODEN | ITNNAL |
| ContentType | Journal Article |
| DOI | 10.1109/TNNLS.2024.3520147 |
| Discipline | Computer Science |
| EISSN | 2162-2388 |
| EndPage | 9776 |
| ExternalDocumentID | 40030848 10_1109_TNNLS_2024_3520147 10826583 |
| Genre | orig-research Journal Article |
| GrantInformation | Vietnam National Foundation for Science and Technology Development (NAFOSTED), Grant 102.05-2021.08 (funder ID: 10.13039/100007224) |
| ISICitedReferencesCount | 12 |
| ISSN | 2162-237X 2162-2388 |
| Issue | 6 |
| Language | English |
| License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
| PMID | 40030848 |
| PageCount | 19 |
| PublicationDate | 2025-06-01 |
| PublicationPlace | United States |
| PublicationTitle | IEEE Transactions on Neural Networks and Learning Systems |
| PublicationTitleAbbrev | TNNLS |
| PublicationTitleAlternate | IEEE Trans Neural Netw Learn Syst |
| PublicationYear | 2025 |
| Publisher | IEEE |
| StartPage | 9758 |
| SubjectTerms | Data analysis; Data mining; Data structures; Deep learning (DL); Electronic mail; Feature extraction; graph filtration; graph neural network (GNN); Graph neural networks; Learning systems; persistent homology (PH); Representation learning; Surveys; Taxonomy; topological data analysis (TDA); topological representation learning |
| Title | Topological Data Analysis in Graph Neural Networks: Surveys and Perspectives |
| URI | https://ieeexplore.ieee.org/document/10826583 https://www.ncbi.nlm.nih.gov/pubmed/40030848 https://www.proquest.com/docview/3173403960 |
| Volume | 36 |