An approximate backpropagation learning rule for memristor based neural networks using synaptic plasticity

Detailed bibliography
Published in: Neurocomputing (Amsterdam), Volume 237, pp. 193-199
Main authors: Negrov, D., Karandashev, I., Shakirov, V., Matveyev, Yu., Dunin-Barkowski, W., Zenkevich, A.
Format: Journal Article
Language: English
Published: Elsevier B.V., 10.05.2017
ISSN: 0925-2312, 1872-8286
Description
Summary: We describe an approximation to the backpropagation algorithm for training deep neural networks, designed to work with synapses implemented with memristors. The key idea is to represent the values of both the input signal and the backpropagated delta value with a series of pulses that trigger multiple positive or negative updates of the synaptic weight, and to use the min operation instead of the product of the two signals. In computational simulations, we show that the proposed approximation to backpropagation converges well and may be suitable for memristor implementations of multilayer neural networks.
DOI: 10.1016/j.neucom.2016.10.061
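
The summary describes a weight update in which the product of the input signal and the backpropagated delta is replaced by a min of the two magnitudes, each encoded as a train of pulses that drives the memristive synapse up or down. Below is a minimal NumPy sketch of that idea; the pulse quantization scheme, the clipping to [0, 1], and the names approx_weight_update, n_pulses and lr are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def approx_weight_update(x, delta, lr=0.01, n_pulses=16):
    """Hypothetical min-based substitute for the backprop product x * delta.

    x     : pre-synaptic activations, shape (n_in,)
    delta : backpropagated errors,    shape (n_out,)
    Magnitudes are quantized into at most `n_pulses` pulses; each weight
    receives min(pulses_x, pulses_delta) fixed-size updates, signed by
    the product of the two signs.
    """
    # Encode magnitudes as integer pulse counts (assumed encoding).
    px = np.round(np.clip(np.abs(x), 0.0, 1.0) * n_pulses)        # (n_in,)
    pd = np.round(np.clip(np.abs(delta), 0.0, 1.0) * n_pulses)    # (n_out,)

    # min of the two pulse counts replaces the product |x| * |delta|.
    pulses = np.minimum(px[:, None], pd[None, :])                 # (n_in, n_out)

    # Direction of the update follows the sign of the exact product.
    sign = np.sign(x)[:, None] * np.sign(delta)[None, :]

    # Each coincident pulse pair triggers one conductance step of size lr/n_pulses.
    return lr * sign * pulses / n_pulses

# Example with hypothetical shapes: compare against the exact outer-product update.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=5)        # pre-synaptic activations
delta = rng.uniform(-1, 1, size=3)    # backpropagated deltas
dW_exact = 0.01 * np.outer(x, delta)          # standard backprop update
dW_approx = approx_weight_update(x, delta)    # pulse-quantized min-based update
```

In a memristor implementation, each of the min(pulses_x, pulses_delta) coincident pulse pairs would correspond to one programming pulse applied to the synapse, raising or lowering its conductance by a fixed step, consistent with the pulse-train encoding described in the summary.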