Superior performance of using hyperbolic sine activation functions in ZNN illustrated via time-varying matrix square roots finding

Bibliographic Details
Published in:Computer Science and Information Systems Vol. 9; no. 4; pp. 1603 - 1625
Main Authors: Zhang, Yunong, Jin, Long, Ke, Zhende
Format: Journal Article
Language:English
Published: 2012
ISSN:1820-0214, 2406-1018
Description
Summary: A special class of recurrent neural network (RNN), termed the Zhang neural network (ZNN) and depicted in implicit dynamics, has recently been proposed for the online solution of time-varying matrix square roots. Such a ZNN model can be constructed by using monotonically increasing odd activation functions to obtain the theoretical time-varying matrix square roots in an error-free manner. Different choices of activation-function arrays may lead to different performance of the ZNN model. Generally speaking, a ZNN model using hyperbolic sine activation functions achieves better performance than those using other activation functions. In this paper, to pursue superior convergence and robustness properties, hyperbolic sine activation functions are applied to the ZNN model for the online solution of time-varying matrix square roots. Theoretical analysis and computer-simulation results further demonstrate the superior performance of the ZNN model using hyperbolic sine activation functions in the context of large model-implementation errors, in comparison with that using linear activation functions.
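The abstract's construction can be sketched numerically. In the standard ZNN design, an error function E(t) = X(t)X(t) - A(t) is driven by the design formula dE/dt = -gamma * phi(E), where phi is a monotonically increasing odd activation function (here the hyperbolic sine). For the matrix square root this yields the implicit dynamics X*Xdot + Xdot*X = Adot - gamma*sinh(E), a Sylvester equation in Xdot at each instant. The following is a minimal sketch, not the authors' implementation: it assumes a constant symmetric positive-definite A (so Adot = 0), a simple forward-Euler integration, and illustrative values for the hypothetical parameters gamma, dt, and steps.

```python
import numpy as np
from scipy.linalg import solve_sylvester

def znn_sqrt(A, gamma=10.0, dt=1e-3, steps=2000):
    """Sketch of a ZNN iteration for the matrix square root of a
    constant symmetric positive-definite matrix A.

    Error function: E = X @ X - A.
    ZNN design formula dE/dt = -gamma * sinh(E) (elementwise sinh)
    gives the implicit dynamics
        X @ Xdot + Xdot @ X = -gamma * sinh(E),
    a Sylvester equation solved for Xdot at every time step.
    """
    n = A.shape[0]
    X = np.eye(n)  # initial state
    for _ in range(steps):
        E = X @ X - A
        # solve_sylvester(a, b, q) solves a @ x + x @ b = q
        Xdot = solve_sylvester(X, X, -gamma * np.sinh(E))
        X = X + dt * Xdot  # forward-Euler integration
    return X

# Illustrative usage with a small SPD matrix:
A = np.array([[5.0, 2.0], [2.0, 5.0]])
X = znn_sqrt(A)
print(np.allclose(X @ X, A, atol=1e-4))
```

The elementwise sinh amplifies large residuals, which is the intuition behind the paper's claim of faster convergence than linear activation (phi(E) = E) when the error is large; for time-varying A(t) the right-hand side would additionally carry the Adot term.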
DOI:10.2298/CSIS120121043Z