A Universal Approximation Result for Difference of Log-Sum-Exp Neural Networks

Detailed bibliography
Published in: IEEE Transactions on Neural Networks and Learning Systems, Volume 31, Issue 12, pp. 5603-5612
Main authors: Calafiore, Giuseppe C., Gaubert, Stephane, Possieri, Corrado
Format: Journal Article
Language: English
Publication details: United States: IEEE, 01.12.2020
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Subject:
ISSN: 2162-237X, 2162-2388
Online access: Get full text
Abstract We show that a neural network whose output is obtained as the difference of the outputs of two feedforward networks with exponential activation function in the hidden layer and logarithmic activation function in the output node, referred to as log-sum-exp (LSE) network, is a smooth universal approximator of continuous functions over convex, compact sets. By using a logarithmic transform, this class of network maps to a family of subtraction-free ratios of generalized posynomials (GPOS), which we also show to be universal approximators of positive functions over log-convex, compact subsets of the positive orthant. The main advantage of difference-LSE networks with respect to classical feedforward neural networks is that, after a standard training phase, they provide surrogate models for a design that possesses a specific difference-of-convex-functions form, which makes them optimizable via relatively efficient numerical methods. In particular, by adapting an existing difference-of-convex algorithm to these models, we obtain an algorithm for performing an effective optimization-based design. We illustrate the proposed approach by applying it to the data-driven design of a diet for a patient with type-2 diabetes and to a nonconvex optimization problem.
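As an illustrative reading of the Abstract field above (not code from the paper), the sketch below evaluates a difference-of-LSE model of the kind described: two feedforward networks whose hidden units apply the exponential function to affine combinations of the input, with a logarithmic output node, and the model output taken as the difference of the two. The names lse, dlse, A1, b1, A2, b2 and the use of NumPy are assumptions made only for this example.

    import numpy as np

    def lse(x, A, b):
        # One LSE network: hidden layer z = A @ x + b with exponential
        # activations; the output node takes the log of their sum,
        # stabilized by subtracting the max before exponentiating.
        z = A @ x + b
        m = z.max()
        return m + np.log(np.exp(z - m).sum())

    def dlse(x, A1, b1, A2, b2):
        # Difference-of-LSE model: the difference of two convex LSE terms,
        # i.e. a difference-of-convex-functions surrogate in x.
        return lse(x, A1, b1) - lse(x, A2, b2)

    # Toy evaluation with random weights standing in for trained parameters.
    rng = np.random.default_rng(0)
    A1, b1 = rng.normal(size=(5, 3)), rng.normal(size=5)
    A2, b2 = rng.normal(size=(4, 3)), rng.normal(size=4)
    x = rng.normal(size=3)
    print(dlse(x, A1, b1, A2, b2))

Under the componentwise logarithmic change of variables x = log(y), exp(dlse(log y)) becomes a ratio of two generalized posynomials in y, which is the subtraction-free form referred to in the abstract.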
Author Gaubert, Stephane
Calafiore, Giuseppe C.
Possieri, Corrado
Author_xml – sequence: 1
  givenname: Giuseppe C.
  orcidid: 0000-0002-6428-5653
  surname: Calafiore
  fullname: Calafiore, Giuseppe C.
  email: giuseppe.calafiore@polito.it
  organization: Dipartimento di Elettronica e Telecomunicazioni, Politecnico di Torino, Turin, Italy
– sequence: 2
  givenname: Stephane
  surname: Gaubert
  fullname: Gaubert, Stephane
  email: stephane.gaubert@inria.fr
  organization: Inria, Palaiseau, France
– sequence: 3
  givenname: Corrado
  orcidid: 0000-0003-2528-3935
  surname: Possieri
  fullname: Possieri, Corrado
  email: corrado.possieri@iasi.cnr.it
  organization: Consiglio Nazionale delle Ricerche, Istituto di Analisi dei Sistemi ed Informatica A. Ruberti, Rome, Italy
BackLink https://www.ncbi.nlm.nih.gov/pubmed/32167912 (View this record in MEDLINE/PubMed)
https://inria.hal.science/hal-02423871 (View record in HAL)
CODEN ITNNAL
CitedBy_id crossref_primary_10_1109_LCSYS_2020_3032083
crossref_primary_10_1109_TAP_2021_3111299
crossref_primary_10_1109_TNNLS_2025_3570807
crossref_primary_10_1007_s10898_023_01272_1
crossref_primary_10_1109_TFUZZ_2022_3160614
crossref_primary_10_1109_TCSVT_2023_3247944
crossref_primary_10_1016_j_neunet_2023_11_014
crossref_primary_10_1016_j_cma_2023_116333
crossref_primary_10_3934_mfc_2024046
crossref_primary_10_1109_TNNLS_2024_3378697
crossref_primary_10_1016_j_neucom_2022_09_108
crossref_primary_10_3233_JIFS_211417
crossref_primary_10_1134_S0361768823100080
crossref_primary_10_1007_s00020_024_02769_4
crossref_primary_10_1007_s11227_021_04038_2
crossref_primary_10_1109_JPROC_2021_3065238
crossref_primary_10_1109_LCOMM_2021_3098750
crossref_primary_10_1109_TETCI_2024_3502463
crossref_primary_10_1109_TNNLS_2021_3105732
crossref_primary_10_1109_TNNLS_2022_3190198
crossref_primary_10_1002_spe_3408
crossref_primary_10_3390_ma16093430
crossref_primary_10_1109_TIA_2023_3296065
Cites_doi 10.1137/0111030
10.1007/s10479-004-5022-1
10.1109/22.390193
10.1007/s10107-018-1235-y
10.1007/978-3-642-54455-2_1
10.1109/87.221350
10.1090/S0002-9939-1980-0553381-X
10.1142/S0129065789000414
10.1007/s10208-014-9231-y
10.1016/j.jfa.2011.09.003
10.1017/CBO9780511804441
10.1016/0893-6080(89)90020-8
10.1007/978-3-319-57240-6_1
10.1109/TNNLS.2019.2910417
10.1504/IJMMNO.2013.055204
10.1007/978-3-030-20867-7_24
10.1007/s11634-008-0030-7
10.1080/02331931003770411
10.1109/TCSI.2004.834521
10.1109/TBME.2007.893506
10.1007/978-3-0348-8268-2_8
10.1007/BF02551274
10.1137/0801001
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2020
Attribution
Copyright_xml – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2020
– notice: Attribution
DBID 97E
RIA
RIE
AAYXX
CITATION
NPM
7QF
7QO
7QP
7QQ
7QR
7SC
7SE
7SP
7SR
7TA
7TB
7TK
7U5
8BQ
8FD
F28
FR3
H8D
JG9
JQ2
KR7
L7M
L~C
L~D
P64
7X8
1XC
DOI 10.1109/TNNLS.2020.2975051
DatabaseName IEEE Xplore (IEEE)
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE/IET Electronic Library
CrossRef
PubMed
Aluminium Industry Abstracts
Biotechnology Research Abstracts
Calcium & Calcified Tissue Abstracts
Ceramic Abstracts
Chemoreception Abstracts
Computer and Information Systems Abstracts
Corrosion Abstracts
Electronics & Communications Abstracts
Engineered Materials Abstracts
Materials Business File
Mechanical & Transportation Engineering Abstracts
Neurosciences Abstracts
Solid State and Superconductivity Abstracts
METADEX
Technology Research Database
ANTE: Abstracts in New Technology & Engineering
Engineering Research Database
Aerospace Database
Materials Research Database
ProQuest Computer Science Collection
Civil Engineering Abstracts
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts – Academic
Computer and Information Systems Abstracts Professional
Biotechnology and BioEngineering Abstracts
MEDLINE - Academic
Hyper Article en Ligne (HAL)
DatabaseTitle CrossRef
PubMed
Materials Research Database
Technology Research Database
Computer and Information Systems Abstracts – Academic
Mechanical & Transportation Engineering Abstracts
ProQuest Computer Science Collection
Computer and Information Systems Abstracts
Materials Business File
Aerospace Database
Engineered Materials Abstracts
Biotechnology Research Abstracts
Chemoreception Abstracts
Advanced Technologies Database with Aerospace
ANTE: Abstracts in New Technology & Engineering
Civil Engineering Abstracts
Aluminium Industry Abstracts
Electronics & Communications Abstracts
Ceramic Abstracts
Neurosciences Abstracts
METADEX
Biotechnology and BioEngineering Abstracts
Computer and Information Systems Abstracts Professional
Solid State and Superconductivity Abstracts
Engineering Research Database
Calcium & Calcified Tissue Abstracts
Corrosion Abstracts
MEDLINE - Academic
DatabaseTitleList
MEDLINE - Academic
Materials Research Database

PubMed
Database_xml – sequence: 1
  dbid: NPM
  name: PubMed
  url: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed
  sourceTypes: Index Database
– sequence: 2
  dbid: RIE
  name: IEEE Electronic Library (IEL)
  url: https://ieeexplore.ieee.org/
  sourceTypes: Publisher
– sequence: 3
  dbid: 7X8
  name: MEDLINE - Academic
  url: https://search.proquest.com/medline
  sourceTypes: Aggregation Database
DeliveryMethod fulltext_linktorsrc
Discipline Computer Science
Mathematics
EISSN 2162-2388
EndPage 5612
ExternalDocumentID oai:HAL:hal-02423871v1
32167912
10_1109_TNNLS_2020_2975051
9032340
Genre orig-research
Research Support, Non-U.S. Gov't
Journal Article
GrantInformation_xml – fundername: PGMO program of EDF and Fondation Mathématique Jacques Hadamard
  funderid: 10.13039/501100007493
– fundername: LabEx LMH through the “Investissement d’avenir”
  grantid: ANR-11-LABX-0056-LMH
  funderid: 10.13039/501100004100
GroupedDBID 0R~
4.4
5VS
6IK
97E
AAJGR
AARMG
AASAJ
AAWTH
ABAZT
ABQJQ
ABVLG
ACIWK
ACPRK
AENEX
AFRAH
AGQYO
AGSQL
AHBIQ
AKJIK
AKQYR
ALMA_UNASSIGNED_HOLDINGS
ATWAV
BEFXN
BFFAM
BGNUA
BKEBE
BPEOZ
EBS
EJD
IFIPE
IPLJI
JAVBF
M43
MS~
O9-
OCL
PQQKQ
RIA
RIE
RNS
AAYXX
CITATION
NPM
RIG
7QF
7QO
7QP
7QQ
7QR
7SC
7SE
7SP
7SR
7TA
7TB
7TK
7U5
8BQ
8FD
F28
FR3
H8D
JG9
JQ2
KR7
L7M
L~C
L~D
P64
7X8
1XC
ID FETCH-LOGICAL-c385t-eeedd6a1ac6c2eabaeab89aca3fba5e10dda57aae8aa26bfb22c31090a9a235f3
IEDL.DBID RIE
ISICitedReferencesCount 34
ISICitedReferencesURI http://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=Summon&SrcAuth=ProQuest&DestLinkType=CitingArticles&DestApp=WOS_CPL&KeyUT=000595533300044&url=https%3A%2F%2Fcvtisr.summon.serialssolutions.com%2F%23%21%2Fsearch%3Fho%3Df%26include.ft.matches%3Dt%26l%3Dnull%26q%3D
ISSN 2162-237X
2162-2388
IngestDate Tue Oct 14 21:00:21 EDT 2025
Thu Oct 02 11:37:01 EDT 2025
Mon Jun 30 03:09:32 EDT 2025
Thu Jan 02 22:57:52 EST 2025
Sat Nov 29 01:40:05 EST 2025
Tue Nov 18 22:33:42 EST 2025
Wed Aug 27 02:33:58 EDT 2025
IsPeerReviewed false
IsScholarly true
Issue 12
Keywords LSE networks
DC programming
Subtraction-free expressions
Surrogate models
Feedforward neural networks
Data-driven optimization
Universal approximation
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
Attribution: http://creativecommons.org/licenses/by
LinkModel DirectLink
MergedId FETCHMERGED-LOGICAL-c385t-eeedd6a1ac6c2eabaeab89aca3fba5e10dda57aae8aa26bfb22c31090a9a235f3
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 14
content type line 23
ORCID 0000-0003-2528-3935
0000-0002-6428-5653
PMID 32167912
PQID 2467298629
PQPubID 85436
PageCount 10
ParticipantIDs pubmed_primary_32167912
proquest_miscellaneous_2377332491
proquest_journals_2467298629
crossref_primary_10_1109_TNNLS_2020_2975051
crossref_citationtrail_10_1109_TNNLS_2020_2975051
hal_primary_oai_HAL_hal_02423871v1
ieee_primary_9032340
PublicationCentury 2000
PublicationDate 2020-12-01
PublicationDateYYYYMMDD 2020-12-01
PublicationDate_xml – month: 12
  year: 2020
  text: 2020-12-01
  day: 01
PublicationDecade 2020
PublicationPlace United States
PublicationPlace_xml – name: United States
– name: Piscataway
PublicationTitle IEEE Transactions on Neural Networks and Learning Systems
PublicationTitleAbbrev TNNLS
PublicationTitleAlternate IEEE Trans Neural Netw Learn Syst
PublicationYear 2020
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
References zhang (ref12) 2018; 80
ref14
ref30
ref11
ref2
ref1
ref17
ref16
ref18
ref24
ref26
ref25
ref20
ref22
ref21
ref27
ref29
ref8
ref7
ovchinnikov (ref13) 2002; 43
akian (ref19) 2018
ref9
ref4
ref3
itenberg (ref10) 2007
ref6
ref5
goodfellow (ref15) 2013; 28
bertsekas (ref23) 1999
ripsin (ref28) 2009; 79
References_xml – ident: ref21
  doi: 10.1137/0111030
– year: 1999
  ident: ref23
  publication-title: Nonlinear Programming
– ident: ref24
  doi: 10.1007/s10479-004-5022-1
– volume: 80
  start-page: 5824
  year: 2018
  ident: ref12
  article-title: Tropical geometry of deep neural networks
  publication-title: Proc 35th Int Conf Mach Learn
– ident: ref3
  doi: 10.1109/22.390193
– ident: ref27
  doi: 10.1007/s10107-018-1235-y
– volume: 79
  start-page: 29
  year: 2009
  ident: ref28
  article-title: Management of blood glucose in type 2 diabetes mellitus
  publication-title: Amer Family Phys
– ident: ref26
  doi: 10.1007/978-3-642-54455-2_1
– ident: ref4
  doi: 10.1109/87.221350
– ident: ref18
  doi: 10.1090/S0002-9939-1980-0553381-X
– ident: ref5
  doi: 10.1142/S0129065789000414
– ident: ref8
  doi: 10.1007/s10208-014-9231-y
– volume: 28
  start-page: iii-1319
  year: 2013
  ident: ref15
  article-title: Maxout networks
  publication-title: Proc 30th Int Conf Int Conf Mach Learn (ICML)
– ident: ref17
  doi: 10.1016/j.jfa.2011.09.003
– ident: ref6
  doi: 10.1017/CBO9780511804441
– ident: ref2
  doi: 10.1016/0893-6080(89)90020-8
– volume: 43
  start-page: 297
  year: 2002
  ident: ref13
  article-title: Max-min representations of piecewise linear functions
  publication-title: Contrib Algebra Geometry
– start-page: 225
  year: 2018
  ident: ref19
  article-title: Minimax representation of nonexpansive functions and application to zero-sum recursive games
  publication-title: J Convex Anal
– ident: ref11
  doi: 10.1007/978-3-319-57240-6_1
– ident: ref7
  doi: 10.1109/TNNLS.2019.2910417
– ident: ref30
  doi: 10.1504/IJMMNO.2013.055204
– ident: ref16
  doi: 10.1007/978-3-030-20867-7_24
– year: 2007
  ident: ref10
  publication-title: Tropical Algebraic Geometry (Oberwolfach Seminars)
– ident: ref25
  doi: 10.1007/s11634-008-0030-7
– ident: ref20
  doi: 10.1080/02331931003770411
– ident: ref14
  doi: 10.1109/TCSI.2004.834521
– ident: ref29
  doi: 10.1109/TBME.2007.893506
– ident: ref9
  doi: 10.1007/978-3-0348-8268-2_8
– ident: ref1
  doi: 10.1007/BF02551274
– ident: ref22
  doi: 10.1137/0801001
SSID ssj0000605649
Score 2.5448906
Snippet We show that a neural network whose output is obtained as the difference of the outputs of two feedforward networks with exponential activation function in the...
SourceID hal
proquest
pubmed
crossref
ieee
SourceType Open Access Repository
Aggregation Database
Index Database
Enrichment Source
Publisher
StartPage 5603
SubjectTerms Algorithms
Artificial neural networks
Computer Science
Continuity (mathematics)
Convex functions
Data-driven optimization
Design
Design optimization
Diabetes mellitus
difference of convex (DC) programming
Feedforward neural networks
feedforward neural networks (FFNs)
log-sum-exp (LSE) networks
Machine Learning
Mathematical analysis
Mathematical models
Mathematics
Neural networks
Numerical methods
Optimization
Optimization and Control
Subtraction
subtraction-free expressions
surrogate models
Transforms
universal approximation
Title A Universal Approximation Result for Difference of Log-Sum-Exp Neural Networks
URI https://ieeexplore.ieee.org/document/9032340
https://www.ncbi.nlm.nih.gov/pubmed/32167912
https://www.proquest.com/docview/2467298629
https://www.proquest.com/docview/2377332491
https://inria.hal.science/hal-02423871
Volume 31
WOSCitedRecordID wos000595533300044&url=https%3A%2F%2Fcvtisr.summon.serialssolutions.com%2F%23%21%2Fsearch%3Fho%3Df%26include.ft.matches%3Dt%26l%3Dnull%26q%3D
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
journalDatabaseRights – providerCode: PRVIEE
  databaseName: IEEE Electronic Library (IEL)
  customDbUrl:
  eissn: 2162-2388
  dateEnd: 99991231
  omitProxy: false
  ssIdentifier: ssj0000605649
  issn: 2162-237X
  databaseCode: RIE
  dateStart: 20120101
  isFulltext: true
  titleUrlDefault: https://ieeexplore.ieee.org/
  providerName: IEEE
link http://cvtisr.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1La9wwEB6SUEouTdv0sW0S1NJbq0SW1pZ1XPIgh8WUJi17M3o5DaTrkF2H_PyOZNlQaAs5GIw9kgZ_I2tGmgfAJ-ly56zRdJobQaeSa1oqXdJGSq1tIayM3oQ_5rKqysVCfd2AL2MsjPc-Op_5w3Abz_Jda7uwVXakmOBiigb6ppSyj9Ua91MY6uVF1HZ5VnDKhVwMMTJMHV1W1fwCrUHODkMoKUriNjwVPJxBZPyPJWnzZ3CIjJVW_q10xsXnbOdxbD-HZ0nJJLNeKl7Ahl--hJ2hgANJ83kXqhlJjhmBOmQXf7juQxnJN7_qbtYEVVpykmqoWE_ahszbK3rR_aKnD7ckZPbAllXvSr56Bd_PTi-Pz2kqsECtKPM1RU6dK3SGoFjutdF4IVJWi8bo3GfMOZ0jZr7UmhemMZzbkEmUaaW5yBvxGraW7dK_BZJ7YabM2iyk9HNOmcw4hsS8dPgrVnYC2fCNa5uyj4ciGDd1tEKYqiNEdYCoThBN4PPY5rbPvfFf6o8I3UgY0mafz-Z1eBb0EIGW4T0S7QZ8RqoEzQT2BqTrNIdXNUfGuUKLT03gw_gaZ184UtFL33ZII6QUqJMq7PlNLyFj34N4vfv7mO9hO7Dfu8bswdb6rvP78MTer69Xdwco4ovyIIr4b9lJ9Pg
linkProvider IEEE
linkToHtml http://cvtisr.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1Lb9QwEB6VgqAXCpTHlgIGcaNuEzuJ4-MKWi0iRIguaG-WYztQqWyq7qbqz-_YeUiVChKHSFEydib5xvGMPQ-A98Km1ppK0yStOE0E0zSXOqe1EFqbjBsRvAl_FqIs88VCftuA_TEWxjkXnM_cgT8Ne_m2Ma1fKjuUEWc8QQP9bpokLO6itcYVlQg18yzouyzOGGVcLIYomUgezsuyOEF7kEUHPpgUZXEL7nPmdyFidmNSuvPbu0SGWit_VzvD9HO8_X-MP4KHvZpJpp1cPIYNt3wC20MJB9KP6B0op6R3zfDUPr_41WkXzEi-u1V7tiao1JJPfRUV40hTk6L5RU_aP_To6pz43B7YsuycyVdP4cfx0fzjjPYlFqjhebqmyKm1mY4RFsOcrjQeiJXRvK506uLIWp0iai7XmmVVXTFmfC7RSEvNeFrzZ7C5bJbuBZDU8SqJjIl9Uj9rZRVXNkJillv8GUszgXj4xsr0-cd9GYwzFeyQSKoAkfIQqR6iCXwY25x32Tf-Sf0OoRsJfeLs2bRQ_prXRDjahpdItOPxGal6aCawNyCt-lG8UgwZZxJtPjmBt-NtHH9-U0UvXdMiDReCo1YqsefnnYSMfQ_itXv7M9_Ag9n8a6GKz-WXl7DlX6VzlNmDzfVF617BPXO5Pl1dvA6Cfg311PdX
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=A+Universal+Approximation+Result+for+Difference+of+Log-Sum-Exp+Neural+Networks&rft.jtitle=IEEE+transaction+on+neural+networks+and+learning+systems&rft.au=Calafiore%2C+Giuseppe+C&rft.au=Gaubert%2C+Stephane&rft.au=Possieri%2C+Corrado&rft.date=2020-12-01&rft.eissn=2162-2388&rft.volume=31&rft.issue=12&rft.spage=5603&rft_id=info:doi/10.1109%2FTNNLS.2020.2975051&rft_id=info%3Apmid%2F32167912&rft.externalDocID=32167912
thumbnail_l http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/lc.gif&issn=2162-237X&client=summon
thumbnail_m http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/mc.gif&issn=2162-237X&client=summon
thumbnail_s http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/sc.gif&issn=2162-237X&client=summon