Exploiting neuro-inspired dynamic sparsity for energy-efficient intelligent perception

Detailed bibliography
Published in: Nature Communications, Volume 16, Issue 1, Article 9928 (15 pages)
Main authors: Zhou, Sheng; Gao, Chang; Delbruck, Tobi; Verhelst, Marian; Liu, Shih-Chii
Format: Journal Article
Language: English
Publication details: London: Nature Publishing Group UK (Nature Portfolio), 11 November 2025
ISSN: 2041-1723
Description
Summary: Artificial intelligence (AI) has made significant strides towards efficient online processing of sensory signals at the edge through the use of deep neural networks with ever-expanding size. However, this trend has brought with it escalating computational costs and energy consumption, which have become major obstacles to the deployment and further upscaling of these models. In this Perspective, we present a neuro-inspired vision to boost the energy efficiency of AI for perception by leveraging brain-like dynamic sparsity. We categorize various forms of dynamic sparsity rooted in data redundancy and discuss potential strategies to enhance and exploit it through algorithm-hardware co-design. Additionally, we explore the technological, architectural, and algorithmic challenges that need to be addressed to fully unlock the potential of dynamic-sparsity-aware neuro-inspired AI for energy-efficient perception.

Edge AI enables intelligent perception in sensory devices, yet at excessive energy costs. This Perspective outlines a neuro-inspired vision for efficient edge perception, sketching the design space of data-driven and stateful dynamic sparsity to selectively activate sensors, memory, and compute.
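The abstract centres on data-driven dynamic sparsity: skipping memory access and compute for inputs that carry little new information. The Python sketch below illustrates one common instance of this idea, a dense layer that only recomputes contributions from inputs whose change since the last step exceeds a threshold. It is a minimal illustration under assumed settings; the layer sizes, threshold value, and all names are hypothetical and are not taken from the paper.

# Illustrative sketch of data-driven dynamic sparsity via thresholded delta updates.
# Layer sizes, threshold, and names are hypothetical; this is not the paper's method.
import numpy as np

class DeltaDenseLayer:
    """Dense layer that caches its previous output and only processes input
    elements whose change since the last update exceeds a threshold."""

    def __init__(self, in_dim, out_dim, threshold=0.05, rng=None):
        rng = rng or np.random.default_rng(0)
        self.W = rng.standard_normal((out_dim, in_dim)) * 0.1
        self.threshold = threshold
        self.x_prev = np.zeros(in_dim)    # last input state deemed significant
        self.y_state = np.zeros(out_dim)  # cached pre-activation output

    def forward(self, x):
        delta = x - self.x_prev
        active = np.abs(delta) > self.threshold   # data-driven sparsity mask
        # Only weight columns for "active" inputs are fetched and multiplied;
        # everything else is skipped and the cached output is reused.
        self.y_state += self.W[:, active] @ delta[active]
        self.x_prev[active] = x[active]
        return self.y_state, active.mean()        # output + fraction of work done

# Toy usage on a slowly varying (temporally redundant) signal:
layer = DeltaDenseLayer(in_dim=64, out_dim=32)
t = np.linspace(0.0, 1.0, 100)
for k in range(100):
    frame = np.sin(2 * np.pi * (t[k] + np.arange(64) / 64))
    y, active_frac = layer.forward(frame)

On temporally redundant inputs the fraction of active elements per step stays well below one, so most weight fetches and multiply-accumulates are skipped. This is the kind of saving the Perspective argues should be surfaced and exploited through algorithm-hardware co-design.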
DOI: 10.1038/s41467-025-65387-7