A modified gradient-based backpropagation training method for neural networks

Bibliographic Details
Published in: 2009 IEEE International Conference on Granular Computing, pp. 450–453
Main Authors: Xuewen Mu, Yaling Zhang
Format: Conference Proceeding
Language: English
Published: IEEE, 01.08.2009
ISBN: 9781424448302, 1424448301
Description
Summary: An improved gradient-based backpropagation training method for neural networks is proposed in this paper. Based on the Barzilai-Borwein steplength update and techniques from the Resilient Propagation (Rprop) method, the learning rate is adapted to improve both training speed and the success rate. Experimental results show that the proposed method considerably improves convergence speed and, for the chosen test problems, outperforms other well-known training methods.
DOI: 10.1109/GRC.2009.5255081
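
The summary above refers to the Barzilai-Borwein (BB) steplength update for adapting the learning rate during backpropagation. The sketch below is only an illustration of the generic BB1 step size inside plain gradient-descent training of a tiny one-hidden-layer network; the network, toy data, step-size bounds, and the omission of the paper's Rprop-style safeguards are all assumptions, not the authors' exact method.

```python
# Minimal sketch: Barzilai-Borwein (BB1) step-size adaptation for gradient-descent
# training of a small tanh network (illustrative assumptions throughout; this is
# not a reproduction of the method proposed in the paper).
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x) on [-pi, pi] (assumed test problem)
X = np.linspace(-np.pi, np.pi, 64).reshape(-1, 1)
Y = np.sin(X)

# Flatten all weights into one parameter vector so the BB update is a single
# vector operation.
H = 8
shapes = [(1, H), (1, H), (H, 1), (1, 1)]          # W1, b1, W2, b2
sizes = [int(np.prod(s)) for s in shapes]
w = rng.normal(scale=0.5, size=sum(sizes))

def unpack(w):
    parts, i = [], 0
    for s, n in zip(shapes, sizes):
        parts.append(w[i:i + n].reshape(s))
        i += n
    return parts

def loss_and_grad(w):
    W1, b1, W2, b2 = unpack(w)
    A = np.tanh(X @ W1 + b1)          # hidden layer
    P = A @ W2 + b2                   # output layer
    E = P - Y
    loss = 0.5 * np.mean(E ** 2)
    # Backpropagation of the mean-squared-error loss
    dP = E / len(X)
    dW2, db2 = A.T @ dP, dP.sum(0, keepdims=True)
    dZ = (dP @ W2.T) * (1 - A ** 2)   # tanh derivative
    dW1, db1 = X.T @ dZ, dZ.sum(0, keepdims=True)
    g = np.concatenate([d.ravel() for d in (dW1, db1, dW2, db2)])
    return loss, g

alpha = 1e-2                          # initial learning rate (assumed)
loss, g = loss_and_grad(w)
for k in range(500):
    w_new = w - alpha * g
    loss_new, g_new = loss_and_grad(w_new)
    s, y = w_new - w, g_new - g       # parameter and gradient differences
    sy = s @ y
    if sy > 1e-12:                    # BB1 step: alpha = (s^T s) / (s^T y)
        alpha = float(np.clip((s @ s) / sy, 1e-4, 1.0))
    w, g, loss = w_new, g_new, loss_new

print(f"final loss: {loss:.6f}")
```

The clipping of alpha and the positivity check on s^T y stand in, very loosely, for the kind of safeguards an Rprop-style scheme would provide; the paper's actual combination of the two ideas is described only in the full text.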