The universal approximation theorem for complex-valued neural networks

Detailed bibliography
Published in: Applied and Computational Harmonic Analysis, Vol. 64, pp. 33-61
Main author: Voigtlaender, Felix
Format: Journal Article
Language: English
Published: Elsevier Inc., 01.05.2023
Subjects: Complex-valued neural networks; Deep neural networks; Holomorphic functions; Polyharmonic functions; Universal approximation theorem
ISSN: 1063-5203; EISSN: 1096-603X
Abstract We generalize the classical universal approximation theorem for neural networks to the case of complex-valued neural networks. Precisely, we consider feedforward networks with a complex activation function σ : C → C in which each neuron performs the operation C^N → C, z ↦ σ(b + w^T z) with weights w ∈ C^N and a bias b ∈ C. We completely characterize those activation functions σ for which the associated complex networks have the universal approximation property, meaning that they can uniformly approximate any continuous function on any compact subset of C^d arbitrarily well. Unlike the classical case of real networks, the set of “good activation functions” (those which give rise to networks with the universal approximation property) differs significantly depending on whether one considers deep networks or shallow networks: For deep networks with at least two hidden layers, the universal approximation property holds as long as σ is neither a polynomial, a holomorphic function, nor an antiholomorphic function. Shallow networks, on the other hand, are universal if and only if the real part or the imaginary part of σ is not a polyharmonic function.
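Below is a minimal NumPy sketch, not taken from the article, of the network model the abstract describes: each neuron computes z ↦ σ(b + w^T z) with complex weights w and bias b, hidden layers apply the activation, and the output layer is affine. The layer widths and the "split" ReLU activation σ(z) = ReLU(Re z) + i·ReLU(Im z) are illustrative assumptions; all function and parameter names are hypothetical.

import numpy as np

def split_relu(z):
    # sigma(z) = ReLU(Re z) + i * ReLU(Im z), applied elementwise.
    # This activation is neither a polynomial, holomorphic, nor antiholomorphic,
    # so by the abstract's characterization it is a "good" activation for deep networks.
    return np.maximum(z.real, 0.0) + 1j * np.maximum(z.imag, 0.0)

def complex_layer(z, W, b):
    # One hidden layer: each row of W holds the weight vector w of one neuron, b the biases.
    return split_relu(W @ z + b)

def complex_network(z, params):
    # Feedforward network C^d -> C: hidden layers with activation, affine output layer.
    for W, b in params[:-1]:
        z = complex_layer(z, W, b)
    W_out, b_out = params[-1]
    return W_out @ z + b_out

rng = np.random.default_rng(0)
dims = [3, 16, 16, 1]  # input dimension d = 3 and two hidden layers (the "deep" case)
params = [
    (rng.normal(size=(m, n)) + 1j * rng.normal(size=(m, n)),
     rng.normal(size=m) + 1j * rng.normal(size=m))
    for n, m in zip(dims[:-1], dims[1:])
]
z = rng.normal(size=3) + 1j * rng.normal(size=3)
print(complex_network(z, params))  # a single complex output value

With at least two hidden layers, as configured here, the abstract's characterization says networks of this form can uniformly approximate any continuous function on compact subsets of C^d, provided the activation avoids the polynomial, holomorphic, and antiholomorphic classes.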
Author Voigtlaender, Felix
Author_xml – sequence: 1
  givenname: Felix
  orcidid: 0000-0002-5061-2756
  surname: Voigtlaender
  fullname: Voigtlaender, Felix
  email: felix.voigtlaender@ku.de
  organization: Department of Mathematics, Technical University of Munich, 85748 Garching bei München, Germany
ContentType Journal Article
Copyright 2022 Elsevier Inc.
DOI 10.1016/j.acha.2022.12.002
Discipline Engineering
Mathematics
EISSN 1096-603X
EndPage 61
ISICitedReferencesCount 44
ISSN 1063-5203
IsPeerReviewed true
IsScholarly true
Keywords 30E10
Deep neural networks
41A30
41A63
31A30
Complex-valued neural networks
Universal approximation theorem
68T07
Holomorphic functions
Polyharmonic functions
Language English
ORCID 0000-0002-5061-2756
PageCount 29
PublicationDate May 2023
PublicationTitle Applied and computational harmonic analysis
PublicationYear 2023
Publisher Elsevier Inc
StartPage 33
SubjectTerms Complex-valued neural networks
Deep neural networks
Holomorphic functions
Polyharmonic functions
Universal approximation theorem
Title The universal approximation theorem for complex-valued neural networks
URI https://dx.doi.org/10.1016/j.acha.2022.12.002
Volume 64