A Memristive Spiking Neural Network Circuit with Selective Supervised Attention Algorithm
| Published in: | IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, Vol. 42, No. 8, p. 1 |
|---|---|
| Main Authors: | , , , |
| Format: | Journal Article |
| Language: | English |
| Published: | New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.08.2023 |
| Subjects: | |
| ISSN: | 0278-0070, 1937-4151 |
| Summary: | Spiking neural networks (SNNs) are biologically plausible and computationally powerful. Today, computing systems based on the von Neumann architecture serve as nearly the only hardware basis for implementing SNNs. However, performance bottlenecks in computing speed, cost, and energy consumption hinder the hardware development of SNNs, so efficient non-von Neumann hardware computing systems for SNNs remain to be explored. In this paper, a selective supervised algorithm for spiking neurons, inspired by the selective attention mechanism, is proposed, and a memristive spiking neuron circuit as well as a memristive SNN circuit based on the proposed algorithm are designed. The memristor realizes the learning and storage of the synaptic weights. The proposed algorithm includes a top-down selective supervision method and a bottom-up selective supervision method and, compared with other supervised algorithms, performs excellently on sequence learning. Moreover, top-down and bottom-up attention encoding circuits are designed to provide the hardware foundation for encoding external stimuli into top-down and bottom-up attention spikes, respectively. The proposed memristive SNN circuit classifies the MNIST and Fashion-MNIST datasets with superior accuracy after learning only a small number of labeled samples, which greatly reduces the cost of manual annotation and improves the supervised learning efficiency of the memristive SNN circuit. |
|---|---|
| DOI: | 10.1109/TCAD.2022.3228896 |
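Only the abstract is available in this record, so the algorithm's details are not reproduced here. As a rough illustration of the general idea described in the summary (a spiking neuron whose memristor-like synaptic weights are updated by a supervised, spike-driven rule gated by attention spikes), the following Python sketch uses entirely assumed names, constants, and update rules; it is not the authors' published algorithm or circuit model.

```python
import numpy as np

# Illustrative sketch only -- not the algorithm published in the article.
# A leaky integrate-and-fire (LIF) neuron whose synaptic weights (standing in
# for memristor conductances) are updated by a supervised, spike-driven rule
# gated by an "attention" spike train. All names, constants, and the update
# rule itself are assumptions made for this example.

rng = np.random.default_rng(0)

N_IN, T = 100, 200           # number of input synapses, number of time steps
TAU, V_TH = 20.0, 1.0        # membrane time constant, firing threshold
ETA = 0.01                   # learning rate

w = rng.uniform(0.0, 0.1, N_IN)         # "memristive" weights (bounded conductances)
inputs = rng.random((T, N_IN)) < 0.05   # Poisson-like input spike trains
target = rng.random(T) < 0.02           # desired (teacher) output spike train
attention = rng.random(T) < 0.3         # attention spikes: when learning is enabled

v = 0.0
for t in range(T):
    # Leaky integration of the weighted input spikes.
    v += (-v + w @ inputs[t]) / TAU
    out = v >= V_TH
    if out:
        v = 0.0  # reset the membrane potential after a spike

    # Supervised update only on attended time steps: nudge the weights of the
    # synapses that were active toward (or away from) firing, according to the
    # mismatch between the desired and the actual output spike.
    if attention[t]:
        err = float(target[t]) - float(out)
        w += ETA * err * inputs[t]
        np.clip(w, 0.0, 1.0, out=w)  # keep conductances within device bounds
```

In the memristive circuit described by the article, the weight would correspond to a device conductance modified by programming pulses rather than a floating-point variable as in this sketch.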