Convergence of Batch Split-Complex Backpropagation Algorithm for Complex-Valued Neural Networks

Bibliographic Details
Published in: Discrete Dynamics in Nature and Society, Vol. 2009, No. 1, pp. 539-554
Main Authors: Zhang, Huisheng; Zhang, Chao; Wu, Wei
Format: Journal Article
Language: English
Published: New York: Hindawi Limited, 01.01.2009 (also listed: Hindawi Publishing Corporation; John Wiley & Sons, Inc.; Wiley)
ISSN: 1026-0226, 1607-887X
Description
Summary: The batch split-complex backpropagation (BSCBP) algorithm for training complex-valued neural networks is considered. For a constant learning rate, it is proved that the error function of the BSCBP algorithm decreases monotonically during the training iteration process and that the gradient of the error function tends to zero. Under an additional moderate condition, the weight sequence itself is also proved to be convergent. A numerical example is given to support the theoretical analysis.
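
To make the setting concrete, below is a minimal illustrative sketch (not the authors' code) of batch split-complex backpropagation for a single complex-valued layer with a split tanh activation: the complex weight matrix is held as separate real and imaginary arrays, the batch gradient of the squared error is accumulated over all samples, and one update per epoch is made with a constant learning rate. The network size, the toy data, and the learning rate eta are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy batch: complex inputs x (n_in, N) and targets d (n_out, N).
n_in, n_out, N = 3, 2, 20
x = rng.standard_normal((n_in, N)) + 1j * rng.standard_normal((n_in, N))
d = 0.5 * (rng.standard_normal((n_out, N)) + 1j * rng.standard_normal((n_out, N)))

# Complex weights stored as separate real and imaginary parts
# (the "split" real-valued parametrisation used by split-complex BP).
Wr = 0.1 * rng.standard_normal((n_out, n_in))
Wi = 0.1 * rng.standard_normal((n_out, n_in))

eta = 0.01  # constant learning rate, matching the paper's setting

def forward(Wr, Wi):
    u = (Wr + 1j * Wi) @ x                      # complex pre-activation
    return np.tanh(u.real) + 1j * np.tanh(u.imag)  # split-complex activation

for epoch in range(201):
    o = forward(Wr, Wi)
    e = o - d
    E = 0.5 * np.sum(np.abs(e) ** 2)            # batch error function

    # Backprop through the split activation: tanh'(t) = 1 - tanh(t)^2.
    gu_r = e.real * (1.0 - o.real ** 2)         # dE/d(Re u)
    gu_i = e.imag * (1.0 - o.imag ** 2)         # dE/d(Im u)

    # With Re u = Wr*Re x - Wi*Im x and Im u = Wr*Im x + Wi*Re x,
    # the real-valued gradients of the batch error are:
    gWr = gu_r @ x.real.T + gu_i @ x.imag.T
    gWi = -gu_r @ x.imag.T + gu_i @ x.real.T

    # One batch update per epoch with the constant learning rate.
    Wr -= eta * gWr
    Wi -= eta * gWi

    if epoch % 50 == 0:
        print(f"epoch {epoch:3d}  E = {E:.6f}")
```

For a sufficiently small constant learning rate, the printed batch error E decreases monotonically across epochs, which is the behavior the paper's monotonicity result describes.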
DOI: 10.1155/2009/329173