Neural Acceleration for General-Purpose Approximate Programs

Bibliographic Details
Published in: 2012 45th Annual IEEE/ACM International Symposium on Microarchitecture, pp. 449-460
Main Authors: Esmaeilzadeh, H., Sampson, A., Ceze, L., Burger, D.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.12.2012
ISSN:1072-4451
Description
Summary: This paper describes a learning-based approach to the acceleration of approximate programs. We describe the Parrot transformation, a program transformation that selects and trains a neural network to mimic a region of imperative code. After the learning phase, the compiler replaces the original code with an invocation of a low-power accelerator called a neural processing unit (NPU). The NPU is tightly coupled to the processor pipeline to accelerate small code regions. Since neural networks produce inherently approximate results, we define a programming model that allows programmers to identify approximable code regions -- code that can produce imprecise but acceptable results. Offloading approximable code regions to NPUs is faster and more energy efficient than executing the original code. For a set of diverse applications, NPU acceleration provides whole-application speedup of 2.3× and energy savings of 3.0× on average with quality loss of at most 9.6%.
DOI:10.1109/MICRO.2012.48
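
The abstract describes a programming model in which the programmer marks error-tolerant ("approximable") code regions so the compiler can apply the Parrot transformation and replace them with an NPU invocation. The following is a minimal C sketch of that idea under stated assumptions: the APPROXIMABLE annotation and the edge_strength kernel are hypothetical placeholders chosen for illustration, not the paper's actual syntax or benchmark code.

#include <math.h>
#include <stdio.h>

/* Hypothetical annotation: marks a pure, error-tolerant function as a
   candidate region for the Parrot transformation.  A real toolchain
   would define its own marker; this macro is only a placeholder. */
#define APPROXIMABLE

/* A small, frequently called numeric kernel whose output may be
   imprecise yet still acceptable, making it a candidate for NPU
   offload. */
APPROXIMABLE
static float edge_strength(float nw, float n, float ne,
                           float w,  float c, float e,
                           float sw, float s, float se)
{
    /* Sobel-style gradient magnitude around the center pixel. */
    float gx = (ne + 2.0f * e + se) - (nw + 2.0f * w + sw);
    float gy = (sw + 2.0f * s + se) - (nw + 2.0f * n + ne);
    (void)c; /* the center pixel does not affect the gradient */
    return sqrtf(gx * gx + gy * gy);
}

int main(void)
{
    /* During the learning phase, calls like this one would supply the
       input/output pairs used to train the mimicking neural network;
       after the transformation, the compiler would emit an NPU
       invocation here instead of executing the original body. */
    printf("%f\n", edge_strength(0.1f, 0.2f, 0.3f,
                                 0.4f, 0.5f, 0.6f,
                                 0.7f, 0.8f, 0.9f));
    return 0;
}

The kernel is pure (its result depends only on its arguments) and tolerant of small output errors, the two properties the abstract requires of an approximable region.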