A modified gradient-based backpropagation training method for neural networks


Detailed bibliography
Published in: 2009 IEEE International Conference on Granular Computing, pp. 450-453
Main authors: Xuewen Mu, Yaling Zhang
Format: Conference paper
Language: English
Publication details: IEEE, 01.08.2009
ISBN: 9781424448302, 1424448301
Description
Summary: An improved gradient-based backpropagation training method for neural networks is proposed in this paper. Based on the Barzilai and Borwein step length update and techniques from the Resilient Propagation (Rprop) method, we adapt the learning rate to improve both convergence speed and success rate. Experimental results show that the proposed method considerably improves convergence speed and, on the chosen test problems, outperforms other well-known training methods.
DOI: 10.1109/GRC.2009.5255081
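
The paper itself is not reproduced in this record, but the two ingredients named in the summary (a Barzilai-Borwein step length and Rprop-style control of the learning rate) can be illustrated on a toy problem. The Python sketch below uses an assumed quadratic objective and assumed bounds eta_min and eta_max; it only illustrates the general idea, not the authors' actual training algorithm.

import numpy as np

# Assumed toy objective f(w) = 0.5 * w^T A w - b^T w, standing in for a
# network's training loss; its gradient is A w - b.
A = np.array([[3.0, 0.5],
              [0.5, 1.0]])
b = np.array([1.0, -2.0])

def grad(w):
    return A @ w - b

eta_min, eta_max = 1e-4, 1.0   # Rprop-style bounds on the step length (assumed values)
eta = 0.1                      # initial learning rate (assumed value)

w = np.zeros(2)
g = grad(w)

for _ in range(50):
    w_new = w - eta * g        # plain gradient step with the current step length
    g_new = grad(w_new)

    # Barzilai-Borwein step length: eta = (s^T s) / (s^T y),
    # where s = w_new - w and y = g_new - g.
    s, y = w_new - w, g_new - g
    sy = s @ y
    if abs(sy) > 1e-12:
        eta = (s @ s) / sy

    # Safeguard in the spirit of Rprop: keep the step length inside fixed bounds.
    eta = float(np.clip(eta, eta_min, eta_max))

    w, g = w_new, g_new

print("approximate minimizer:", w)
print("exact minimizer:      ", np.linalg.solve(A, b))

In the paper's setting the same kind of update would be applied to the weights of a neural network, with the backpropagated gradient taking the place of A w - b.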