On the influence of parameter theta(-) on performance of RBF neural networks trained with the dynamic decay adjustment algorithm

Published in: International journal of neural systems, Volume 16, Issue 4, p. 271
Main authors: Oliveira, Adriano L I, Medeiros, Ericles A, Rocha, Thyago A B V, Bezerra, Miguel E R, Veras, Ronaldo C
Format: Journal Article
Language: English
Publication details: Singapore, 01.08.2006
ISSN: 0129-0657
Abstract The dynamic decay adjustment (DDA) algorithm is a fast constructive algorithm for training RBF neural networks (RBFNs) and probabilistic neural networks (PNNs). The algorithm has two parameters, namely theta(+) and theta(-). The papers that introduced DDA argued that these parameters would not heavily influence classification performance, and therefore recommended always using their default values. In contrast, this paper shows that smaller values of the parameter theta(-) can, for a considerable number of datasets, result in strong improvements in generalization performance. The experiments described here were carried out on twenty benchmark classification datasets from both the Proben1 and UCI repositories. The results show that for eleven of the datasets, the parameter theta(-) strongly influenced classification performance. The influence of theta(-) was also noticeable, although much weaker, on six of the remaining datasets. This paper also compares the performance of RBF-DDA with theta(-) selection against both AdaBoost and Support Vector Machines (SVMs).
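To make the role of the two parameters concrete, the following is a minimal, illustrative Python sketch of DDA-style training with Gaussian prototypes; it is not the authors' implementation. In this reading, theta(+) decides when an existing prototype of the correct class is reinforced rather than a new prototype being added, and theta(-) bounds how strongly prototypes of conflicting classes may respond to a training example (their radii are shrunk until the response falls below theta(-)). The class name RBFDDA, the default values 0.4 / 0.2, the fixed epoch count, the large initial radius, and the X_tr/y_tr/X_val/y_val names in the usage comment are all assumptions made for illustration.

import numpy as np

class RBFDDA:
    def __init__(self, theta_plus=0.4, theta_minus=0.2, epochs=4):
        self.theta_plus = theta_plus
        self.theta_minus = theta_minus
        self.epochs = epochs
        self.protos = []  # each prototype: {"center", "sigma", "weight", "label"}

    def _activation(self, p, x):
        # Gaussian RBF response of prototype p to input x
        return np.exp(-np.sum((x - p["center"]) ** 2) / p["sigma"] ** 2)

    def fit(self, X, y):
        X = np.asarray(X, dtype=float)
        for _ in range(self.epochs):
            for p in self.protos:
                p["weight"] = 0.0          # weights are reset at the start of each epoch
            for x, c in zip(X, y):
                # "commit": reinforce a same-class prototype that already covers x ...
                covering = [p for p in self.protos
                            if p["label"] == c and self._activation(p, x) >= self.theta_plus]
                if covering:
                    covering[0]["weight"] += 1.0
                else:
                    # ... otherwise introduce a new prototype centered at x
                    self.protos.append({"center": x.copy(), "sigma": 1e6,
                                        "weight": 1.0, "label": c})
                # "shrink": conflicting-class prototypes must respond below theta(-) at x
                for p in self.protos:
                    if p["label"] != c:
                        d2 = np.sum((x - p["center"]) ** 2)
                        if d2 > 0:
                            p["sigma"] = min(p["sigma"],
                                             np.sqrt(d2 / -np.log(self.theta_minus)))
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        labels = sorted({p["label"] for p in self.protos})
        preds = []
        for x in X:
            scores = {l: sum(p["weight"] * self._activation(p, x)
                             for p in self.protos if p["label"] == l)
                      for l in labels}
            preds.append(max(scores, key=scores.get))
        return np.array(preds)

# Selecting theta(-) on held-out data, in the spirit of the paper
# (X_tr, y_tr, X_val, y_val are placeholders for the reader's own split):
# for t in (0.2, 0.1, 0.05, 0.01):
#     acc = np.mean(RBFDDA(theta_minus=t).fit(X_tr, y_tr).predict(X_val) == y_val)

In this sketch a smaller theta(-) shrinks conflicting-class prototypes more aggressively, reducing overlap between classes; that is one way to read the paper's finding that tuning theta(-) below its default can improve generalization on a number of datasets.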
Author Bezerra, Miguel E R
Medeiros, Ericles A
Oliveira, Adriano L I
Rocha, Thyago A B V
Veras, Ronaldo C
Author_xml – sequence: 1
  givenname: Adriano L I
  surname: Oliveira
  fullname: Oliveira, Adriano L I
  email: adriano@dsc.upe.br
  organization: Department of Computing Systems, Polytechnic School of Engineering, Pernambuco State University, Rua Benfica, 455, Madalena, Recife - PE, Brazil. adriano@dsc.upe.br
– sequence: 2
  givenname: Ericles A
  surname: Medeiros
  fullname: Medeiros, Ericles A
– sequence: 3
  givenname: Thyago A B V
  surname: Rocha
  fullname: Rocha, Thyago A B V
– sequence: 4
  givenname: Miguel E R
  surname: Bezerra
  fullname: Bezerra, Miguel E R
– sequence: 5
  givenname: Ronaldo C
  surname: Veras
  fullname: Veras, Ronaldo C
BackLink https://www.ncbi.nlm.nih.gov/pubmed/16972315 (View this record in MEDLINE/PubMed)
CitedBy_id crossref_primary_10_1007_s00362_015_0694_y
crossref_primary_10_1016_j_neucom_2007_11_003
ContentType Journal Article
DBID CGR
CUY
CVF
ECM
EIF
NPM
7X8
DOI 10.1142/S0129065706000676
DatabaseName Medline
MEDLINE
MEDLINE (Ovid)
MEDLINE
MEDLINE
PubMed
MEDLINE - Academic
DatabaseTitle MEDLINE
Medline Complete
MEDLINE with Full Text
PubMed
MEDLINE (Ovid)
MEDLINE - Academic
DatabaseTitleList MEDLINE - Academic
MEDLINE
Database_xml – sequence: 1
  dbid: NPM
  name: PubMed
  url: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed
  sourceTypes: Index Database
– sequence: 2
  dbid: 7X8
  name: MEDLINE - Academic
  url: https://search.proquest.com/medline
  sourceTypes: Aggregation Database
DeliveryMethod no_fulltext_linktorsrc
Discipline Computer Science
ExternalDocumentID 16972315
Genre Research Support, Non-U.S. Gov't
Journal Article
GroupedDBID ---
.DC
0R~
36B
4.4
53G
5GY
ADSJI
AENEX
ALMA_UNASSIGNED_HOLDINGS
CAG
CGR
COF
CS3
CUY
CVF
DU5
EBS
ECM
EIF
EJD
EMOBN
ESX
F5P
HZ~
NPM
O9-
P2P
P71
RWJ
WSC
7X8
ID FETCH-LOGICAL-c3406-f0e77fc4259a18810b4129f1843ec25bc0e566b2eea8be152642e40b5687910b2
IEDL.DBID 7X8
ISICitedReferencesCount 7
ISICitedReferencesURI http://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=Summon&SrcAuth=ProQuest&DestLinkType=CitingArticles&DestApp=WOS_CPL&KeyUT=000240415500004&url=https%3A%2F%2Fcvtisr.summon.serialssolutions.com%2F%23%21%2Fsearch%3Fho%3Df%26include.ft.matches%3Dt%26l%3Dnull%26q%3D
ISSN 0129-0657
IngestDate Fri Jul 11 09:50:58 EDT 2025
Sat Sep 28 07:52:14 EDT 2024
IsPeerReviewed true
IsScholarly true
Issue 4
Language English
LinkModel DirectLink
MergedId FETCHMERGED-LOGICAL-c3406-f0e77fc4259a18810b4129f1843ec25bc0e566b2eea8be152642e40b5687910b2
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 23
PMID 16972315
PQID 68848109
PQPubID 23479
ParticipantIDs proquest_miscellaneous_68848109
pubmed_primary_16972315
PublicationCentury 2000
PublicationDate 2006-Aug
20060801
PublicationDateYYYYMMDD 2006-08-01
PublicationDate_xml – month: 08
  year: 2006
  text: 2006-Aug
PublicationDecade 2000
PublicationPlace Singapore
PublicationPlace_xml – name: Singapore
PublicationTitle International journal of neural systems
PublicationTitleAlternate Int J Neural Syst
PublicationYear 2006
SSID ssj0014748
Score 1.7414981
Snippet The dynamic decay adjustment (DDA) algorithm is a fast constructive algorithm for training RBF neural networks (RBFNs) and probabilistic neural networks...
SourceID proquest
pubmed
SourceType Aggregation Database
Index Database
StartPage 271
SubjectTerms Algorithms
Artificial Intelligence
Models, Statistical
Neural Networks (Computer)
Nonlinear Dynamics
Time Factors
Title ON the influence of parameter theta- on performance of RBF neural networks trained with the dynamic decay adjustment algorithm
URI https://www.ncbi.nlm.nih.gov/pubmed/16972315
https://www.proquest.com/docview/68848109
Volume 16
WOSCitedRecordID wos000240415500004&url=https%3A%2F%2Fcvtisr.summon.serialssolutions.com%2F%23%21%2Fsearch%3Fho%3Df%26include.ft.matches%3Dt%26l%3Dnull%26q%3D
hasFullText
inHoldings 1
isFullTextHit
isPrint
link http://cvtisr.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwpV07T8MwELYKZWChvClPD6xWkza1EwkJAaJigFDxUrfKTi5QRJNCWiQWfjt3TqJOiIElS-LEOd_Zn-_O9zF2TKEvRweB8LtaC0-pjjBuOxZJLJ0oQgzrWdfA07UKQ38wCPo1dlKdhaG0ympOtBN1nEXkI29Jnwq_O8Hp5F0QZxTFVksCjQVW7yCQIbNUg3kMwVOWO4s8LQIXWlXGNF2v3bq37hfK-pB2vpa_40u7zvQa_-vhKlsp8SU_KxRijdUgXWeNiruBl6a8wb5vQ47Yj48qkhKeJZzqgI8pP4ZuTbXgWcon85MF9MjdeY9TDUz8RlpkkOfc0kxAzMmna18aFzT3PIZIf3Edv85ym8zO9dsz9nn6Mt5kj73Lh4srUXIxiKiDa75IHFAqidDCA-36-JvGQxEmxBYDOKQmcgCBoWkDaN8AggLc14DnmK70FWqDaW-xxTRLYYdxRIwgtYQg8TS2lgZi0EA4RxqFaLLJjir5DlHXKYChU8hm-bCScJNtF0M0nBQlOYauJPY0t7v7Z9s9tlx4USiHb5_VE7RyOGBL0ed0lH8cWhXCa9i_-QHEjdDv
linkProvider ProQuest
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=ON+the+influence+of+parameter+theta-+on+performance+of+RBF+neural+networks+trained+with+the+dynamic+decay+adjustment+algorithm&rft.jtitle=International+journal+of+neural+systems&rft.au=Oliveira%2C+Adriano+L+I&rft.au=Medeiros%2C+Ericles+A&rft.au=Rocha%2C+Thyago+A+B+V&rft.au=Bezerra%2C+Miguel+E+R&rft.date=2006-08-01&rft.issn=0129-0657&rft.volume=16&rft.issue=4&rft.spage=271&rft_id=info:doi/10.1142%2FS0129065706000676&rft_id=info%3Apmid%2F16972315&rft_id=info%3Apmid%2F16972315&rft.externalDocID=16972315
thumbnail_l http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/lc.gif&issn=0129-0657&client=summon
thumbnail_m http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/mc.gif&issn=0129-0657&client=summon
thumbnail_s http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/sc.gif&issn=0129-0657&client=summon