A modified gradient-based backpropagation training method for neural networks

Bibliographic Details
Published in: 2009 IEEE International Conference on Granular Computing, pp. 450-453
Main Authors: Xuewen Mu, Yaling Zhang
Format: Conference Proceeding
Language: English
Published: IEEE, 01.08.2009
ISBN: 9781424448302, 1424448301
Description
Summary: An improved gradient-based backpropagation training method for neural networks is proposed in this paper. Based on the Barzilai-Borwein steplength update and techniques from the Resilient Propagation (Rprop) method, we adapt the learning rate to improve both convergence speed and success rate. Experimental results show that the proposed method considerably improves convergence speed and, on the chosen test problems, outperforms other well-known training methods.
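
To make the summary concrete, the sketch below shows plain gradient descent whose learning rate is set by the standard Barzilai-Borwein formula alpha_k = (s^T s) / (s^T y), with s = w_k - w_{k-1} and y = g_k - g_{k-1}, and then clipped to a fixed [lr_min, lr_max] range in the spirit of Rprop's step bounds. This is only an illustrative assumption of how such a combination might look: the quadratic test problem, the BB1 variant, the bounds, and the fallback rule are all hypothetical choices, since the paper's exact update rule is not detailed in this record.

import numpy as np

def grad(w, A, b):
    """Gradient of the least-squares loss 0.5*||A w - b||^2."""
    return A.T @ (A @ w - b)

def bb_gradient_descent(A, b, w0, lr0=1e-3, lr_min=1e-6, lr_max=1.0,
                        tol=1e-8, max_iter=500):
    w_prev = w0
    g_prev = grad(w_prev, A, b)
    w = w_prev - lr0 * g_prev          # first step: fixed rate (BB needs two iterates)
    for _ in range(max_iter):
        g = grad(w, A, b)
        if np.linalg.norm(g) < tol:
            break
        s, y = w - w_prev, g - g_prev  # iterate and gradient differences
        denom = s @ y
        lr = (s @ s) / denom if denom > 1e-12 else lr_min   # BB1 step length, safe fallback
        lr = float(np.clip(lr, lr_min, lr_max))              # Rprop-like clamping of the step
        w_prev, g_prev = w, g
        w = w - lr * g
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 10))
    b = rng.standard_normal(50)
    w = bb_gradient_descent(A, b, np.zeros(10))
    print("final loss:", 0.5 * np.linalg.norm(A @ w - b) ** 2)

In practice the BB step can oscillate, which is why a clamping or sign-based safeguard (as in Rprop) is a natural companion; the paper reports that such a combination improves both speed and success rate on its test problems.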
DOI: 10.1109/GRC.2009.5255081