An approximate backpropagation learning rule for memristor based neural networks using synaptic plasticity

Detailed Bibliography
Published in: Neurocomputing (Amsterdam), Volume 237, pp. 193-199
Main Authors: Negrov, D., Karandashev, I., Shakirov, V., Matveyev, Yu., Dunin-Barkowski, W., Zenkevich, A.
Format: Journal Article
Language: English
Published: Elsevier B.V., 10.05.2017
ISSN: 0925-2312, 1872-8286
Online Access: Get full text
Description
Summary: We describe an approximation to the backpropagation algorithm for training deep neural networks, designed to work with synapses implemented as memristors. The key idea is to represent both the input signal and the backpropagated delta value as a series of pulses that trigger multiple positive or negative updates of the synaptic weight, and to use the min operation instead of the product of the two signals. In computational simulations, we show that the proposed approximation to backpropagation converges well and may be suitable for memristor implementations of multilayer neural networks.
DOI: 10.1016/j.neucom.2016.10.061
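
The summary describes the core modification: the usual product of the presynaptic signal and the backpropagated delta in the weight update is replaced by the min of their pulse-encoded magnitudes, with the sign of the update following the sign of the product. Below is a minimal Python sketch of this idea; the quantization into pulse counts, the function names (pulse_encode, approx_update), and the fixed per-pulse step eta are illustrative assumptions, not the exact scheme from the paper.

def pulse_encode(value, n_levels=8):
    """Quantize a signal magnitude into an integer pulse count.
    The number of levels is an illustrative assumption."""
    return int(round(min(abs(value), 1.0) * n_levels))


def approx_update(w, x, delta, eta=0.01, n_levels=8):
    """Min-based approximation to the backprop weight update.

    Instead of the exact gradient term x * delta, the weight receives a
    train of identical pulses whose count is min(pulses(x), pulses(delta))
    and whose sign follows sign(x * delta); each pulse changes the
    (memristive) weight by a fixed step eta.
    """
    n_pulses = min(pulse_encode(x, n_levels), pulse_encode(delta, n_levels))
    sign = 1.0 if x * delta >= 0 else -1.0
    return w - eta * sign * n_pulses


# Usage: compare the approximate step with the exact gradient step.
w = 0.5
print(approx_update(w, x=0.7, delta=-0.3))  # pulse/min-based update
print(w - 0.01 * 0.7 * (-0.3))              # exact -eta * x * delta step

Both updates move the weight in the same direction; the approximation only coarsens the magnitude, which is what makes it compatible with fixed-amplitude programming pulses applied to a memristive synapse.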