Exploiting neuro-inspired dynamic sparsity for energy-efficient intelligent perception
| Published in: | Nature Communications, Vol. 16, No. 1, p. 9928 - 15 |
|---|---|
| Main authors: | , , , , |
| Format: | Journal Article |
| Language: | English |
| Published: | London: Nature Publishing Group UK, 11 November 2025 |
| ISSN: | 2041-1723 |
| Summary: | Artificial intelligence (AI) has made significant strides towards efficient online processing of sensory signals at the edge through the use of deep neural networks with ever-expanding size. However, this trend has brought with it escalating computational costs and energy consumption, which have become major obstacles to the deployment and further upscaling of these models. In this Perspective, we present a neuro-inspired vision to boost the energy efficiency of AI for perception by leveraging brain-like dynamic sparsity. We categorize various forms of dynamic sparsity rooted in data redundancy and discuss potential strategies to enhance and exploit it through algorithm-hardware co-design. Additionally, we explore the technological, architectural, and algorithmic challenges that need to be addressed to fully unlock the potential of dynamic-sparsity-aware neuro-inspired AI for energy-efficient perception. Edge AI enables intelligent perception in sensory devices, yet at excessive energy costs. This Perspective outlines a neuro-inspired vision for efficient edge perception, sketching the design space of data-driven and stateful dynamic sparsity to selectively activate sensors, memory, and compute. (A minimal illustrative sketch of such data-driven sparsity follows this record.) |
|---|---|
| DOI: | 10.1038/s41467-025-65387-7 |
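As an illustrative aside, not part of this bibliographic record: the data-driven dynamic sparsity mentioned in the summary, where compute is activated only for inputs that actually change, can be sketched in a few lines. The function `delta_sparse_linear`, its `threshold` parameter, and the random test data below are assumptions made for illustration; they are not taken from the paper.

```python
# Illustrative sketch only (assumption, not from the paper): "delta" sparsity,
# where a linear layer is updated only for inputs that changed noticeably
# since the previous frame; unchanged inputs reuse the cached output.
import numpy as np

def delta_sparse_linear(x_new, x_prev, y_prev, W, threshold=0.05):
    """Incrementally update y = W @ x, skipping columns of W whose
    corresponding input changed by less than `threshold`."""
    delta = x_new - x_prev
    active = np.abs(delta) > threshold             # data-driven sparsity mask
    y_new = y_prev + W[:, active] @ delta[active]  # only active columns do work
    return y_new, float(active.mean())             # output and fraction of inputs processed

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 256))
x_prev = rng.standard_normal(256)
y_prev = W @ x_prev

# A mostly static "frame": only a few input entries change between time steps.
x_new = x_prev.copy()
x_new[:8] += rng.standard_normal(8)

y_new, activity = delta_sparse_linear(x_new, x_prev, y_prev, W)
dense = W @ x_new
print(f"inputs processed: {activity:.1%}, "
      f"max deviation from dense result: {np.max(np.abs(y_new - dense)):.2e}")
```

In this sketch the amount of work per time step scales with the fraction of inputs whose change exceeds the threshold rather than with the full input width, which is one simple form of the selective activation of compute the summary refers to.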