Exploiting neuro-inspired dynamic sparsity for energy-efficient intelligent perception

Bibliographic Details
Published in: Nature Communications, Vol. 16, No. 1, Article 9928 (15 pages)
Main Authors: Zhou, Sheng; Gao, Chang; Delbruck, Tobi; Verhelst, Marian; Liu, Shih-Chii
Format: Journal Article
Language: English
Published: Nature Publishing Group UK, London, 11.11.2025
ISSN: 2041-1723
Description
Summary: Artificial intelligence (AI) has made significant strides towards efficient online processing of sensory signals at the edge through the use of deep neural networks with ever-expanding size. However, this trend has brought with it escalating computational costs and energy consumption, which have become major obstacles to the deployment and further upscaling of these models. In this Perspective, we present a neuro-inspired vision to boost the energy efficiency of AI for perception by leveraging brain-like dynamic sparsity. We categorize various forms of dynamic sparsity rooted in data redundancy and discuss potential strategies to enhance and exploit it through algorithm-hardware co-design. Additionally, we explore the technological, architectural, and algorithmic challenges that need to be addressed to fully unlock the potential of dynamic-sparsity-aware neuro-inspired AI for energy-efficient perception.

Edge AI enables intelligent perception in sensory devices, yet at excessive energy costs. This Perspective outlines a neuro-inspired vision for efficient edge perception, sketching the design space of data-driven and stateful dynamic sparsity to selectively activate sensors, memory, and compute.
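To make the abstract's notion of data-driven dynamic sparsity concrete, below is a minimal illustrative sketch, not the paper's design, of one well-known form of it: delta-threshold gating, where a layer recomputes its output only for inputs that have changed by more than a threshold since the previous timestep, so compute cost tracks data redundancy. All names (DeltaLayer, threshold) and numbers are hypothetical; the example assumes only NumPy.

```python
# Hypothetical sketch of data-driven dynamic sparsity via delta-threshold
# gating (the general "delta network" idea; NOT this paper's specific method).
import numpy as np

class DeltaLayer:
    """Linear layer that only processes input elements whose change since
    the last transmitted value exceeds `threshold` (temporal redundancy)."""

    def __init__(self, in_dim, out_dim, threshold=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((out_dim, in_dim)) * 0.1
        self.threshold = threshold
        self.x_prev = np.zeros(in_dim)   # last *transmitted* input state
        self.y = np.zeros(out_dim)       # cached output, updated incrementally

    def step(self, x):
        delta = x - self.x_prev
        active = np.abs(delta) > self.threshold   # data-driven sparsity mask
        # Incremental update: y += W[:, active] @ delta[active].
        # MAC cost scales with the number of changed inputs, not with in_dim.
        self.y += self.W[:, active] @ delta[active]
        self.x_prev[active] = x[active]           # commit transmitted values
        return self.y, active.mean()              # output + fraction computed

# Usage on a slowly varying signal: most timesteps touch few inputs.
layer = DeltaLayer(in_dim=64, out_dim=32)
t = np.linspace(0.0, 1.0, 100)
for k in range(100):
    x = np.sin(2 * np.pi * (t[k] + np.arange(64) / 64.0))
    y, frac = layer.step(x)
print(f"fraction of inputs recomputed at last step: {frac:.2%}")
```

With a nonzero threshold the cached output drifts slightly from the exact dense result, trading a bounded approximation error for skipped memory accesses and multiply-accumulates; this compute-only-on-change behavior is one instance of the sensor/memory/compute gating the abstract describes.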
DOI: 10.1038/s41467-025-65387-7