On the influence of parameter theta(-) on performance of RBF neural networks trained with the dynamic decay adjustment algorithm

Bibliographic Details
Published in: International Journal of Neural Systems, Vol. 16, No. 4, p. 271
Main Authors: Oliveira, Adriano L. I.; Medeiros, Ericles A.; Rocha, Thyago A. B. V.; Bezerra, Miguel E. R.; Veras, Ronaldo C.
Format: Journal Article
Language: English
Published: Singapore, 01.08.2006
ISSN: 0129-0657
Description
Summary: The dynamic decay adjustment (DDA) algorithm is a fast constructive algorithm for training RBF neural networks (RBFNs) and probabilistic neural networks (PNNs). The algorithm has two parameters, namely theta(+) and theta(-). The papers that introduced DDA argued that these parameters would not heavily influence classification performance and therefore recommended always using their default values. In contrast, this paper shows that smaller values of the parameter theta(-) can, for a considerable number of datasets, yield a strong improvement in generalization performance. The experiments described here were carried out on twenty benchmark classification datasets from the Proben1 and UCI repositories. The results show that for eleven of the datasets the parameter theta(-) strongly influenced classification performance; its influence was also noticeable, although much weaker, on six of the datasets considered. The paper also compares the performance of RBF-DDA with theta(-) selection against both AdaBoost and Support Vector Machines (SVMs).
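The summary names the two DDA parameters but does not show where theta(-) enters the training rule. As context only, below is a minimal sketch of one DDA training epoch with Gaussian prototypes, assuming the usual activation exp(-||x - z||^2 / sigma^2); the placeholder values theta_plus = 0.4 and theta_minus = 0.2, the data layout, and all function names are illustrative assumptions, not taken from the record.

```python
import numpy as np

def rbf_activation(x, center, sigma):
    # Gaussian prototype response: exp(-||x - center||^2 / sigma^2)
    return np.exp(-np.sum((x - center) ** 2) / sigma ** 2)

def dda_train_epoch(X, y, prototypes, theta_plus=0.4, theta_minus=0.2):
    # prototypes: list of dicts {'center', 'sigma', 'weight', 'label'} (assumed layout)
    for x, c in zip(X, y):
        # Cover step: if a same-class prototype already fires above theta(+),
        # only its weight is increased.
        covered = False
        for p in prototypes:
            if p['label'] == c and rbf_activation(x, p['center'], p['sigma']) >= theta_plus:
                p['weight'] += 1.0
                covered = True
                break
        if not covered:
            # Commit step: introduce a new prototype centred at x whose width is
            # limited so that it responds to conflicting-class centres with at most theta(-).
            sigma = np.inf
            for p in prototypes:
                if p['label'] != c:
                    d2 = np.sum((x - p['center']) ** 2)
                    sigma = min(sigma, np.sqrt(d2 / -np.log(theta_minus)))
            prototypes.append({'center': np.array(x, dtype=float),
                               'sigma': sigma if np.isfinite(sigma) else 1.0,
                               'weight': 1.0,
                               'label': c})
        # Shrink step: every conflicting-class prototype must respond to x with
        # at most theta(-), so its width is reduced when necessary.
        for p in prototypes:
            if p['label'] != c:
                d2 = np.sum((x - p['center']) ** 2)
                p['sigma'] = min(p['sigma'], np.sqrt(d2 / -np.log(theta_minus)))
    return prototypes
```

Loosely, smaller values of theta(-) force conflicting-class prototypes to shrink more aggressively, which is where the parameter studied in the article enters the training rule.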
DOI: 10.1142/S0129065706000676