Constructive Approximation to Multivariate Function by Decay RBF Neural Network

Detailed bibliography
Published in: IEEE Transactions on Neural Networks, Vol. 21, No. 9, pp. 1517–1523
Main authors: Hou, Muzhou; Han, Xuli
Format: Journal Article
Language: English
Published: New York, NY: IEEE (Institute of Electrical and Electronics Engineers), 01.09.2010
ISSN: 1045-9227, 1941-0093
Description
Summary: It is well known that single hidden layer feedforward networks with radial basis function (RBF) kernels are universal approximators when all the parameters of the networks are obtained through suitable algorithms. However, as observed in most neural network implementations, tuning all the parameters of the network may make learning complicated and lead to poor generalization, overtraining, and instability. Unlike conventional neural network theories, this brief gives a constructive proof that a decay RBF neural network with n + 1 hidden neurons can interpolate n + 1 multivariate samples with zero error. It then proves that the given decay RBFs can uniformly approximate any continuous multivariate function with arbitrary precision without training. Two numerical experiments demonstrate faster convergence and better generalization performance than the conventional RBF algorithm, the BP algorithm, extreme learning machines, and support vector machines.
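The zero-error interpolation result in the summary rests on a standard property of RBF networks: with one hidden neuron centered at each sample, the output weights are the solution of an (n + 1) × (n + 1) linear system, and when the kernel matrix is nonsingular the network reproduces every training target exactly. A minimal sketch of this idea, using a generic Gaussian RBF rather than the paper's specific decay RBF (the function and variable names here are illustrative, not from the paper):

```python
import numpy as np

def rbf_interpolate(X, y, width=1.0):
    """Fit an RBF network with one hidden neuron per sample point.

    Solving the kernel system G w = y gives output weights that
    reproduce every training target exactly, provided the Gaussian
    kernel matrix G is nonsingular (it is, for distinct centers).
    """
    # Pairwise squared distances between all sample points
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    G = np.exp(-d2 / (2.0 * width**2))  # Gaussian kernel matrix
    w = np.linalg.solve(G, y)           # output-layer weights

    def predict(Z):
        # Evaluate the network at new points Z
        dz2 = np.sum((Z[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        return np.exp(-dz2 / (2.0 * width**2)) @ w

    return predict

# Interpolate 5 random 2-D samples; the training error is
# zero up to floating-point round-off.
rng = np.random.default_rng(0)
X = rng.uniform(size=(5, 2))
y = rng.uniform(size=5)
f = rbf_interpolate(X, y)
print(np.max(np.abs(f(X) - y)))
```

Note that this sketch still solves a linear system; the paper's contribution is a construction that avoids training altogether while retaining uniform approximation guarantees.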
DOI: 10.1109/TNN.2010.2055888