Convergence of Batch Split-Complex Backpropagation Algorithm for Complex-Valued Neural Networks

Bibliographic Details
Published in: Discrete Dynamics in Nature and Society, Vol. 2009, No. 1, pp. 539-554
Main Authors: Zhang, Huisheng; Zhang, Chao; Wu, Wei
Format: Journal Article
Language: English
Published: New York: Hindawi Limited, 01.01.2009
Publisher: Hindawi Publishing Corporation; John Wiley & Sons, Inc.
ISSN: 1026-0226, 1607-887X
Description
Summary: The batch split-complex backpropagation (BSCBP) algorithm for training complex-valued neural networks is considered. For a constant learning rate, it is proved that the error function of the BSCBP algorithm decreases monotonically during the training iteration process and that the gradient of the error function tends to zero. Under a mild additional condition, the weight sequence itself is also proved to be convergent. A numerical example is given to support the theoretical analysis.
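To make the summary concrete, here is a minimal Python/NumPy sketch of a batch split-complex backpropagation update for a single complex-valued neuron. It is not the paper's exact construction: the network size, the tanh activation, the toy data, and the learning rate eta are illustrative assumptions. The split-complex idea is that a real activation is applied separately to the real and imaginary parts of the complex net input, so the error function stays real-valued and can be differentiated with respect to the real and imaginary weight components.

# Minimal sketch of batch split-complex backpropagation (BSCBP) for one
# complex-valued neuron. Hypothetical setup: activation, data, learning
# rate, and network size are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

def split_tanh(z):
    # Split-complex activation: tanh applied separately to the
    # real and imaginary parts of the complex net input.
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

# Toy batch of complex inputs and targets (assumed for illustration).
X = rng.standard_normal((20, 3)) + 1j * rng.standard_normal((20, 3))
w_true = np.array([0.5, -0.3, 0.8]) + 1j * np.array([0.1, 0.4, -0.2])
d = split_tanh(X @ w_true)

w = rng.standard_normal(3) + 1j * rng.standard_normal(3)  # complex weights
eta = 0.05                                                # constant learning rate

for epoch in range(200):
    z = X @ w                          # complex net input, one row per sample
    y = split_tanh(z)                  # split-complex output
    e = y - d                          # complex output error
    E = 0.5 * np.sum(np.abs(e) ** 2)   # real-valued batch error function

    # Split gradient: treat the real and imaginary weight parts as
    # independent real variables, then recombine. d tanh(u)/du = 1 - tanh(u)^2.
    gr = e.real * (1 - np.tanh(z.real) ** 2)   # dE/d(Re z)
    gi = e.imag * (1 - np.tanh(z.imag) ** 2)   # dE/d(Im z)
    # Re z = X_r w_r - X_i w_i and Im z = X_r w_i + X_i w_r, hence:
    grad_wr = X.real.T @ gr + X.imag.T @ gi
    grad_wi = -X.imag.T @ gr + X.real.T @ gi
    w -= eta * (grad_wr + 1j * grad_wi)        # batch update, constant step

print("final batch error:", 0.5 * np.sum(np.abs(split_tanh(X @ w) - d) ** 2))

Because the update accumulates the gradient over the whole batch and uses a fixed eta, the error E computed at the top of each pass decreases monotonically for a suitably small step size, which is the behavior the convergence analysis establishes.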
DOI: 10.1155/2009/329173