Accelerated gradient algorithm for RBF neural network

Bibliographic Details
Published in:Neurocomputing (Amsterdam) Vol. 441; pp. 237 - 247
Main Authors: Han, Hong-Gui, Ma, Miao-Li, Qiao, Jun-Fei
Format: Journal Article
Language:English
Published: Elsevier B.V., 21.06.2021
Subjects:
ISSN:0925-2312, 1872-8286
Description
Summary:Gradient-based algorithms are commonly used for training radial basis function neural networks (RBFNNs). However, one challenge in the training process is how to avoid the vanishing gradient. To solve this problem, an accelerated gradient algorithm (AGA) is designed in this paper to improve the learning performance of the RBFNN. First, an indirect detection mechanism, based on the instantaneous gradient decay rate (IGDR) and the instantaneous convergence rate (ICR), is developed to identify the vanishing gradient during the learning process. Second, an amplification gradient strategy (AGS), which increases the gradient values of the learning parameters, is designed to accelerate the learning speed of the RBFNN. Third, an analysis of the AGA-based RBFNN (AGA-RBFNN) is given to guarantee its successful application. Finally, benchmark and real-world problems are used to illustrate the effectiveness of AGA-RBFNN.
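To make the idea in the abstract concrete, the following is a minimal sketch, not the authors' actual method: it trains only the output weights of a Gaussian RBF network by gradient descent and, when the ratio of successive gradient norms (a stand-in for the instantaneous gradient decay rate, IGDR) falls below a threshold, scales the gradient up before the update, loosely mimicking an amplification gradient strategy. All names, thresholds, and the amplification factor here are illustrative assumptions.

```python
import numpy as np

def gaussian_rbf(X, centers, widths):
    """Hidden-layer outputs: phi_j(x) = exp(-||x - c_j||^2 / (2 * sigma_j^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * widths ** 2))

def train_step(X, y, centers, widths, w, lr=0.1,
               decay_thresh=0.5, amplify=5.0, prev_gnorm=None):
    """One gradient step on the output weights with a toy amplification rule:
    if ||g_t|| / ||g_{t-1}|| drops below decay_thresh (gradient decaying fast,
    i.e. a vanishing-gradient indicator), scale g_t by `amplify` before updating.
    decay_thresh and amplify are illustrative, not values from the paper."""
    phi = gaussian_rbf(X, centers, widths)   # (n_samples, n_centers)
    err = phi @ w - y                        # prediction error
    g = phi.T @ err / len(y)                 # gradient of 0.5 * MSE w.r.t. w
    gnorm = np.linalg.norm(g)
    if prev_gnorm is not None and prev_gnorm > 0 and gnorm / prev_gnorm < decay_thresh:
        g = amplify * g                      # amplification step (sketch of AGS)
    return w - lr * g, gnorm
```

In this sketch only the linear output weights are trained, so the loss is convex in `w`; the paper's AGA additionally covers the nonlinear center and width parameters, where vanishing gradients are the real concern.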
DOI:10.1016/j.neucom.2021.02.009