Convergence of Batch Split-Complex Backpropagation Algorithm for Complex-Valued Neural Networks

Bibliographic Details
Published in: Discrete Dynamics in Nature and Society, Vol. 2009, No. 1, pp. 539–554
Main Authors: Zhang, Huisheng; Zhang, Chao; Wu, Wei
Format: Journal Article
Language: English
Published: New York: Hindawi Limited, 01.01.2009
Hindawi Publishing Corporation
John Wiley & Sons, Inc
Wiley
ISSN: 1026-0226, 1607-887X
Online Access: Full text
Description
Abstract: The batch split-complex backpropagation (BSCBP) algorithm for training complex-valued neural networks is considered. For a constant learning rate, it is proved that the error function of the BSCBP algorithm is monotone during the training iteration process and that the gradient of the error function tends to zero. By adding a moderate condition, the weight sequence itself is also proved to be convergent. A numerical example is given to support the theoretical analysis.
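The monotone-error behavior described in the abstract can be illustrated with a minimal sketch of split-complex backpropagation for a single complex-valued neuron with the split activation f(z) = tanh(Re z) + i tanh(Im z), trained in batch mode with a constant learning rate. The network size, data, and learning rate here are illustrative assumptions, not values from the paper.

```python
import math
import random

def train(steps, eta, n_inputs=3, n_samples=5, seed=0):
    """Batch split-complex gradient descent for one neuron (illustrative sketch).

    Returns the per-iteration batch errors so the monotone decrease under a
    small constant learning rate can be inspected.
    """
    rng = random.Random(seed)
    # Complex inputs stored as (real, imag) pairs; targets kept inside tanh's range.
    data = []
    for _ in range(n_samples):
        x = [(rng.uniform(-1, 1), rng.uniform(-1, 1)) for _ in range(n_inputs)]
        d = (rng.uniform(-0.5, 0.5), rng.uniform(-0.5, 0.5))
        data.append((x, d))
    wr = [0.1] * n_inputs    # real parts of the complex weights
    wi = [-0.1] * n_inputs   # imaginary parts
    errors = []
    for _ in range(steps):
        gr = [0.0] * n_inputs
        gi = [0.0] * n_inputs
        err = 0.0
        for x, d in data:
            # Complex pre-activation w . x, split into real (u) and imaginary (v) parts.
            u = sum(wr[k] * x[k][0] - wi[k] * x[k][1] for k in range(n_inputs))
            v = sum(wr[k] * x[k][1] + wi[k] * x[k][0] for k in range(n_inputs))
            our, oi = math.tanh(u), math.tanh(v)   # split activation
            er, ei = our - d[0], oi - d[1]
            err += 0.5 * (er * er + ei * ei)       # squared-error contribution
            du = er * (1.0 - our * our)            # dE/du
            dv = ei * (1.0 - oi * oi)              # dE/dv
            for k in range(n_inputs):
                gr[k] += du * x[k][0] + dv * x[k][1]   # dE/dwr[k]
                gi[k] += -du * x[k][1] + dv * x[k][0]  # dE/dwi[k]
        # Batch update with a constant learning rate, the setting analyzed in the paper.
        wr = [wr[k] - eta * gr[k] for k in range(n_inputs)]
        wi = [wi[k] - eta * gi[k] for k in range(n_inputs)]
        errors.append(err)
    return errors

errors = train(steps=200, eta=0.005)
```

With a sufficiently small constant learning rate the recorded batch errors decrease monotonically, matching the first convergence result stated in the abstract.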
DOI: 10.1155/2009/329173