Neural network with unbounded activation functions is universal approximator

This paper presents an investigation of the approximation property of neural networks with unbounded activation functions, such as the rectified linear unit (ReLU), which is the new de-facto standard of deep learning. The ReLU network can be analyzed by the ridgelet transform with respect to Lizorkin distributions.
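To make the universal-approximation claim concrete, here is a minimal numerical sketch (not the paper's algorithm): a one-hidden-layer ReLU network whose hidden units are ridge functions relu(a*x - b), in the spirit of the integral-representation view the ridgelet analysis formalizes. The target function, the random sampling of (a, b), and the least-squares fit of the output weights are illustrative assumptions, not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

# Target function to approximate on [-pi, pi] (illustrative choice).
target = np.sin
x = np.linspace(-np.pi, np.pi, 400)

# Hidden units: randomly sampled slopes a and thresholds b (assumed ranges).
n_hidden = 200
a = rng.uniform(-3.0, 3.0, size=n_hidden)
b = rng.uniform(-3.0, 3.0, size=n_hidden)

# Design matrix of ridge functions relu(a*x - b), one column per hidden unit.
Phi = relu(np.outer(x, a) - b)          # shape (len(x), n_hidden)

# Fit the output weights by least squares.
c, *_ = np.linalg.lstsq(Phi, target(x), rcond=None)

approx = Phi @ c
print("max abs error:", np.max(np.abs(approx - target(x))))
```

With a couple hundred random ReLU units the fit error is already small, illustrating numerically the approximation property that the paper establishes analytically via the ridgelet transform.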


Bibliographic Details
Published in: Applied and Computational Harmonic Analysis, Vol. 43, No. 2, pp. 233-268
Main Authors: Sonoda, Sho; Murata, Noboru
Format: Journal Article
Language: English
Published: Elsevier Inc., 01.09.2017
ISSN: 1063-5203, 1096-603X