Event-Driven Random Back-Propagation: Enabling Neuromorphic Deep Learning Machines

Detailed Bibliography
Published in: Frontiers in Neuroscience, Volume 11, p. 324
Main Authors: Neftci, Emre O.; Augustine, Charles; Paul, Somnath; Detorakis, Georgios
Format: Journal Article
Language: English
Published: Switzerland: Frontiers Research Foundation, 21.06.2017
Frontiers Media S.A.
ISSN: 1662-453X, 1662-4548
Description
Summary: An ongoing challenge in neuromorphic computing is to devise general and computationally efficient models of inference and learning which are compatible with the spatial and temporal constraints of the brain. One increasingly popular and successful approach is to take inspiration from inference and learning algorithms used in deep neural networks. However, the workhorse of deep learning, the gradient-descent Back-Propagation (BP) rule, often relies on the immediate availability of network-wide information stored in high-precision memory during learning, and on precise operations that are difficult to realize in neuromorphic hardware. Remarkably, recent work showed that exact backpropagated gradients are not essential for learning deep representations. Building on these results, we demonstrate an event-driven random BP (eRBP) rule that uses error-modulated synaptic plasticity for learning deep representations. Using a two-compartment Leaky Integrate-and-Fire (I&F) neuron, the rule requires only one addition and two comparisons per synaptic weight, making it well suited to implementation in digital or mixed-signal neuromorphic hardware. Our results show that eRBP learns deep representations rapidly, achieving classification accuracies on permutation-invariant datasets comparable to those obtained in artificial neural network simulations on GPUs, while remaining robust to quantization of neural and synaptic state during learning.
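The summary describes the learning rule at a level of detail that invites a sketch: error feedback reaches each neuron through fixed random weights, and each weight update costs one addition and two comparisons. The NumPy fragment below is a minimal illustrative reading of such a rule under assumptions made here, not the authors' reference implementation; the feedback matrix G, the boxcar bounds bmin/bmax, the learning rate, and the name erbp_step are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative layer sizes
n_in, n_hid, n_out = 784, 100, 10

# Learned feedforward weights and a fixed random feedback matrix
W1 = rng.normal(0.0, 0.1, (n_hid, n_in))
W2 = rng.normal(0.0, 0.1, (n_out, n_hid))
G = rng.uniform(-1.0, 1.0, (n_hid, n_out))  # random feedback, never trained

def erbp_step(W, pre_spikes, v_soma, dend_err, bmin=-1.0, bmax=1.0, lr=1e-4):
    """One event-driven, eRBP-style update of the weight matrix W.

    pre_spikes : 0/1 vector of presynaptic spike events
    v_soma     : somatic membrane potentials of the postsynaptic layer
    dend_err   : error integrated at the dendritic compartment
                 (for hidden units, roughly G @ output_error)

    Per synapse this costs two comparisons (the boxcar gate on v_soma)
    and one addition (accumulating the gated error into the weight).
    """
    gate = (v_soma > bmin) & (v_soma < bmax)    # two comparisons
    dw = np.outer(gate * dend_err, pre_spikes)  # nonzero only for spiking inputs
    return W - lr * dw                          # sign convention assumed: err = prediction - label

# Stand-in activity for a single event-driven step
x = (rng.random(n_in) < 0.05).astype(float)   # input spike events
h = (rng.random(n_hid) < 0.10).astype(float)  # hidden spike events
e = rng.normal(0.0, 1.0, n_out)               # output error (prediction - label)
W2 = erbp_step(W2, h, rng.normal(0.0, 1.0, n_out), e)
W1 = erbp_step(W1, x, rng.normal(0.0, 1.0, n_hid), G @ e)
```

Only the gated error term and the presynaptic event enter the update, so no backward pass through the feedforward weights is needed; this is the sense in which the feedback is "random".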
This article was submitted to Neuromorphic Engineering, a section of the journal Frontiers in Neuroscience
Edited by: André van Schaik, Western Sydney University, Australia
Reviewed by: Mark D. McDonnell, University of South Australia, Australia; Michael Pfeiffer, Robert Bosch, Germany
DOI: 10.3389/fnins.2017.00324