Backpropagation Algorithms for a Broad Class of Dynamic Networks

Bibliographic Details
Published in: IEEE Transactions on Neural Networks, Vol. 18, No. 1, pp. 14-27
Main Authors: De Jesus, Orlando; Hagan, M.T.
Format: Journal Article
Language: English
Published: New York, NY: IEEE (Institute of Electrical and Electronics Engineers), 01.01.2007
ISSN: 1045-9227
Description
Summary: This paper introduces a general framework for describing dynamic neural networks: the layered digital dynamic network (LDDN). This framework allows the development of two general algorithms for computing the gradients and Jacobians of these dynamic networks: backpropagation-through-time (BPTT) and real-time recurrent learning (RTRL). The structure of the LDDN framework enables an efficient implementation of both algorithms for arbitrary dynamic networks. The paper demonstrates that the BPTT algorithm is more efficient for gradient calculations, while the RTRL algorithm is more efficient for Jacobian calculations.
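
The summary contrasts BPTT and RTRL without showing the mechanics. As a rough illustration only, the sketch below computes a BPTT gradient for a single tanh recurrent layer, a(t) = tanh(Wx x(t) + Wa a(t-1) + b), under a sum-of-squared-error loss. The layer, the loss, and every name here (bptt_gradient, Wx, Wa, b) are assumptions made for this example; they are not taken from the paper, whose LDDN framework covers far more general network architectures.

```python
# Illustrative sketch only: BPTT gradient for a single tanh recurrent layer,
# not the paper's LDDN implementation.
import numpy as np

def bptt_gradient(Wx, Wa, b, xs, targets):
    """Gradients of E = 0.5 * sum_t ||a(t) - targets[t]||^2 w.r.t. Wx, Wa, b."""
    T = len(xs)
    n = b.shape[0]
    # Forward pass: simulate the network and store every activation,
    # because the backward sweep revisits each time step.
    a_prev = np.zeros(n)
    acts, prev_states = [], []
    for t in range(T):
        prev_states.append(a_prev)                 # a(t-1) at step t
        a_prev = np.tanh(Wx @ xs[t] + Wa @ a_prev + b)
        acts.append(a_prev)                        # a(t)
    # Backward pass: sweep time in reverse, accumulating weight gradients.
    dWx, dWa, db = np.zeros_like(Wx), np.zeros_like(Wa), np.zeros_like(b)
    delta_next = np.zeros(n)                       # sensitivity from step t+1
    for t in reversed(range(T)):
        # Direct error at step t plus the term fed back through Wa.
        da = (acts[t] - targets[t]) + Wa.T @ delta_next
        delta = da * (1.0 - acts[t] ** 2)          # derivative of tanh
        dWx += np.outer(delta, xs[t])
        dWa += np.outer(delta, prev_states[t])
        db += delta
        delta_next = delta
    return dWx, dWa, db

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Wx = rng.normal(size=(3, 2))
    Wa = 0.5 * rng.normal(size=(3, 3))
    b = np.zeros(3)
    xs = [rng.normal(size=2) for _ in range(10)]
    targets = [np.zeros(3) for _ in range(10)]
    dWx, dWa, db = bptt_gradient(Wx, Wa, b, xs, targets)
    print(dWx.shape, dWa.shape, db.shape)          # (3, 2) (3, 3) (3,)
```

The sketch also hints at the trade-off behind the paper's stated conclusion: BPTT stores the whole forward trajectory and then makes one backward sweep per scalar loss, whereas RTRL propagates sensitivity terms forward in time, which is broadly why the paper finds RTRL more efficient when a full Jacobian, rather than a single gradient, is required.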
DOI: 10.1109/TNN.2006.882371