UniCoS: A Unified Neural and Accelerator Co-Search Framework for CNNs and ViTs
Saved in:

| Published in: | 2025 62nd ACM/IEEE Design Automation Conference (DAC), pp. 1-6 |
|---|---|
| Main Authors: | , , , , , , , |
| Format: | Conference Proceedings |
| Language: | English |
| Published: | IEEE, 22.06.2025 |
| Online Access: | Full text |
| Summary: | Current algorithm-hardware co-search works often suffer from lengthy training times and inadequate exploration of hardware design spaces, leading to suboptimal performance. This work introduces UniCoS, a unified framework for co-optimizing neural networks and accelerators for CNNs and Vision Transformers (ViTs). By introducing a novel training-free proxy that evaluates accuracy within seconds and a clustering-based algorithm for exploring heterogeneous dataflows, UniCoS efficiently navigates the design spaces of both architectures. Experimental results demonstrate that the solutions generated by UniCoS consistently surpass state-of-the-art (SOTA) methods (e.g., a 3.54× energy-delay product (EDP) improvement with 1.76% higher accuracy on ImageNet) while requiring notably less search time (up to 48× less, ~3 hours). The code is available at https://github.com/mine7777/Unicos.git. |
| DOI: | 10.1109/DAC63849.2025.11133418 |
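
The abstract mentions a training-free proxy that scores a candidate network's accuracy within seconds, but does not describe its construction. As a rough, hedged illustration of how such zero-cost proxies typically work, the sketch below computes a NASWOT-style activation-pattern score on a single minibatch; the candidate models, batch size, and scoring rule are illustrative assumptions and are not the proxy used by UniCoS.

```python
import torch
import torch.nn as nn

def naswot_score(model: nn.Module, inputs: torch.Tensor) -> float:
    """Illustrative training-free proxy (NASWOT-style): log-determinant of the
    ReLU activation-pattern kernel over one minibatch. Higher scores are
    assumed to correlate with higher trained accuracy (assumption, not the
    UniCoS proxy)."""
    codes = []  # binary activation patterns, one row per sample

    def hook(_module, _inp, out):
        # 1 where the unit fires, 0 otherwise, flattened per sample
        codes.append((out.detach() > 0).flatten(1).float())

    handles = [m.register_forward_hook(hook)
               for m in model.modules() if isinstance(m, nn.ReLU)]
    with torch.no_grad():
        model(inputs)
    for h in handles:
        h.remove()

    c = torch.cat(codes, dim=1)          # shape: N x (total ReLU units)
    n_units = c.shape[1]
    hamming = torch.cdist(c, c, p=0)     # pairwise Hamming distances
    k = n_units - hamming                # activation-agreement kernel
    _, logdet = torch.linalg.slogdet(k)
    return float(logdet)

# Example: rank two hypothetical candidate networks in seconds, no training.
if __name__ == "__main__":
    batch = torch.randn(32, 3, 32, 32)
    candidates = {
        "small": nn.Sequential(nn.Conv2d(3, 16, 3), nn.ReLU(), nn.Flatten()),
        "wide":  nn.Sequential(nn.Conv2d(3, 64, 3), nn.ReLU(), nn.Flatten()),
    }
    for name, net in candidates.items():
        print(name, naswot_score(net, batch))
```

In a co-search loop of the kind the abstract describes, a score like this would replace full training when ranking architecture candidates, which is what makes a search time on the order of hours plausible.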