An approximate backpropagation learning rule for memristor based neural networks using synaptic plasticity

Bibliographic Details
Published in: Neurocomputing (Amsterdam), Vol. 237, pp. 193–199
Main Authors: Negrov, D., Karandashev, I., Shakirov, V., Matveyev, Yu., Dunin-Barkowski, W., Zenkevich, A.
Format: Journal Article
Language:English
Published: Elsevier B.V., 10.05.2017
Subjects:
ISSN: 0925-2312, 1872-8286
Description
Summary: We describe an approximation to the backpropagation algorithm for training deep neural networks that is designed to work with synapses implemented as memristors. The key idea is to represent the values of both the input signal and the backpropagated delta with a series of pulses that trigger multiple positive or negative updates of the synaptic weight, and to use the min operation instead of the product of the two signals. In computational simulations, we show that the proposed approximation to backpropagation converges well and may be suitable for memristor implementations of multilayer neural networks.
DOI: 10.1016/j.neucom.2016.10.061
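
As a rough illustration of the update rule described in the summary (not code from the paper), the sketch below encodes the magnitudes of the input x and the backpropagated delta as integer pulse counts and replaces the product x_i * delta_j in the standard weight update with the min of the two counts, signed by the would-be product. The function name, the n_pulses parameter, and the normalization are assumptions made for this example.

import numpy as np

def approx_weight_update(x, delta, lr=0.01, n_pulses=16):
    """Sketch of a pulse-coded, min-based weight update.

    x        : pre-synaptic activations for one layer, shape (n_in,)
    delta    : backpropagated error for that layer, shape (n_out,)
    lr       : weight change contributed by a single coincident pulse
    n_pulses : maximum number of pulses used to encode a value
               (an assumed discretization parameter, not from the paper)
    """
    eps = 1e-12
    # Encode the magnitudes of x and delta as integer pulse counts.
    px = np.round(np.abs(x) / (np.abs(x).max() + eps) * n_pulses)
    pd = np.round(np.abs(delta) / (np.abs(delta).max() + eps) * n_pulses)

    # The min of the two pulse counts stands in for the exact product x_i * delta_j.
    coincident = np.minimum(px[:, None], pd[None, :])

    # The direction of the update follows the sign of the would-be product.
    direction = np.sign(x)[:, None] * np.sign(delta)[None, :]

    # Each coincident pulse triggers one positive or negative weight step.
    return lr * direction * coincident

# Example: update the weights of a 4-input, 3-output layer.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))
x = rng.normal(size=4)
delta = rng.normal(size=3)
W -= approx_weight_update(x, delta)   # gradient-descent sign convention

In a memristive crossbar, the min-and-sign form is attractive because each weight update can be realized as a train of identical programming pulses applied while both the row (input) and column (delta) lines are active, rather than requiring an analog multiplication per synapse; the sketch above only mimics that behavior in software.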