Cauchy activation function and XNet
We have developed a novel activation function, named the Cauchy Activation Function. This function is derived from the Cauchy Integral Theorem in complex analysis and is specifically tailored for problems requiring high precision. This innovation has led to the creation of a new class of neural networks, which we call (Comple)XNet, or simply XNet.
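As a numerical illustration of the idea, a Cauchy-kernel-inspired activation can be sketched as a smooth rational function. The exact parameterization used by XNet is defined in the paper itself; the form below, phi(x) = lam1·x/(x² + d²) + lam2/(x² + d²) with trainable scalars lam1, lam2, d, is an assumed variant, and all names are illustrative placeholders rather than the authors' implementation.

```python
# Hedged sketch: an assumed Cauchy-kernel-inspired activation of the form
#   phi(x) = lam1 * x / (x^2 + d^2) + lam2 / (x^2 + d^2),
# where lam1, lam2, d would be trainable scalars. The exact form used by
# XNet is defined in the paper; these names are placeholders.
import numpy as np

def cauchy_activation(x, lam1=1.0, lam2=1.0, d=1.0):
    """Smooth, bounded rational activation inspired by the Cauchy kernel."""
    denom = x * x + d * d
    return lam1 * x / denom + lam2 / denom

# Unlike ReLU, this function is infinitely differentiable everywhere
# and decays toward zero as |x| grows.
x = np.linspace(-5.0, 5.0, 101)
y = cauchy_activation(x)
```

Because the function is a ratio of low-degree polynomials, it stays cheap to evaluate while remaining smooth, which is the property the abstract attributes to high-precision applications such as PDE solving.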
| Published in: | Neural networks Vol. 188; p. 107375 |
|---|---|
| Main Authors: | Li, Xin; Xia, Zhihong; Zhang, Hongkun |
| Format: | Journal Article |
| Language: | English |
| Published: | United States: Elsevier Ltd, 01.08.2025 |
| Subjects: | |
| ISSN: | 0893-6080; 1879-2782 |
| Online Access: | Get full text |
| Abstract | We have developed a novel activation function, named the Cauchy Activation Function. This function is derived from the Cauchy Integral Theorem in complex analysis and is specifically tailored for problems requiring high precision. This innovation has led to the creation of a new class of neural networks, which we call (Comple)XNet, or simply XNet.
We will demonstrate that XNet is particularly effective for high-dimensional challenges such as image classification and solving Partial Differential Equations (PDEs). Our evaluations show that XNet significantly outperforms established benchmarks like MNIST and CIFAR-10 in computer vision, and offers substantial advantages over Physics-Informed Neural Networks (PINNs) in both low-dimensional and high-dimensional PDE scenarios. |
|---|---|
| ArticleNumber | 107375 |
| Author | Zhang, Hongkun; Li, Xin; Xia, Zhihong |
| Author_xml | 1. Li, Xin (xinli2023@u.northwestern.edu), Department of Computer Science, Northwestern University, Evanston, IL, USA; 2. Xia, Zhihong (xia@math.northwestern.edu), School of Natural Science, Great Bay University, Guangdong, China; 3. Zhang, Hongkun (hzhang@umass.edu), School of Natural Science, Great Bay University, Guangdong, China |
| BackLink | https://www.ncbi.nlm.nih.gov/pubmed/40157236 (View this record in MEDLINE/PubMed) |
| ContentType | Journal Article |
| Copyright | Copyright © 2025 Elsevier Ltd. All rights reserved. |
| DOI | 10.1016/j.neunet.2025.107375 |
| DatabaseName | CrossRef; MEDLINE; MEDLINE (Ovid); PubMed; MEDLINE - Academic |
| Discipline | Computer Science |
| EISSN | 1879-2782 |
| ExternalDocumentID | 40157236; 10_1016_j_neunet_2025_107375; S0893608025002540 |
| Genre | Journal Article |
| ISICitedReferencesCount | 1 |
| ISSN | 0893-6080 1879-2782 |
| IsPeerReviewed | true |
| IsScholarly | true |
| Keywords | Physics-Informed Neural Networks; Cauchy Integral Theorem; Image classification |
| Language | English |
| License | Copyright © 2025 Elsevier Ltd. All rights reserved. |
| PMID | 40157236 |
| PQID | 3184019358 |
| PQPubID | 23479 |
| PublicationCentury | 2000 |
| PublicationDate | 2025-08-01 |
| PublicationDecade | 2020 |
| PublicationPlace | United States |
| PublicationTitle | Neural networks |
| PublicationTitleAlternate | Neural Netw |
| PublicationYear | 2025 |
| Publisher | Elsevier Ltd |
| SourceID | proquest pubmed crossref elsevier |
| SourceType | Aggregation Database Index Database Publisher |
| StartPage | 107375 |
| SubjectTerms | Algorithms; Cauchy Integral Theorem; Humans; Image classification; Neural Networks, Computer; Physics-Informed Neural Networks |
| Title | Cauchy activation function and XNet |
| URI | https://dx.doi.org/10.1016/j.neunet.2025.107375 https://www.ncbi.nlm.nih.gov/pubmed/40157236 https://www.proquest.com/docview/3184019358 |
| Volume | 188 |