Native DRAM Cache: Re-architecting DRAM as a Large-Scale Cache for Data Centers

Bibliographic Details
Published in: 2024 ACM/IEEE 51st Annual International Symposium on Computer Architecture (ISCA), pp. 1144-1156
Main Authors: Ryu, Yesin, Kim, Yoojin, Jung, Giyong, Ahn, Jung Ho, Kim, Jungrae
Format: Conference Proceeding
Language: English
Published: IEEE, 29.06.2024
Description
Summary: Contemporary data center CPUs are experiencing an unprecedented surge in core count. This trend demands a rethinking of Last-Level Cache (LLC) strategies to accommodate growing capacity requirements. While DRAM offers significant capacity, using it as a cache poses latency and energy challenges. This paper introduces Native DRAM Cache (NDC), a novel DRAM architecture designed specifically to operate as a cache. NDC features innovative approaches, such as conducting tag matching and way selection within a DRAM subarray and repurposing existing precharge transistors for tag matching. These innovations facilitate Caching-In-Memory (CIM) and enable NDC to serve as a high-capacity LLC with high set-associativity, low latency, high throughput, and low energy. Our evaluation demonstrates that NDC significantly outperforms state-of-the-art DRAM cache solutions, improving performance by 2.8%/52.5%/44.2% on average (up to 8.4%/140.6%/85.5%) in the SPEC/NPB/GAP benchmark suites, respectively.
DOI:10.1109/ISCA59077.2024.00086
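The summary's central idea is moving set-associative tag matching and way selection into the DRAM subarray itself. As a point of reference, the following is a minimal software model of the lookup logic a set-associative cache performs; this is not the paper's hardware design, and the set count, associativity, and eviction policy here are hypothetical values chosen only for illustration.

```python
# Illustrative model of set-associative cache lookup. In NDC, the tag
# compare and way selection below happen inside a DRAM subarray (using
# repurposed precharge transistors); here the same logic is modeled in
# software. Geometry and policy are made up for demonstration.

NUM_SETS = 4   # hypothetical number of sets
WAYS = 4       # hypothetical associativity

class SetAssocCache:
    def __init__(self):
        # Each set holds WAYS (tag, data) entries; None marks an empty way.
        self.sets = [[None] * WAYS for _ in range(NUM_SETS)]

    def lookup(self, addr):
        """Return cached data on a hit, or None on a miss."""
        set_idx = addr % NUM_SETS     # index bits select the set
        tag = addr // NUM_SETS        # remaining bits form the tag
        for way in self.sets[set_idx]:
            if way is not None and way[0] == tag:
                return way[1]         # hit: the matching way is selected
        return None                   # miss

    def fill(self, addr, data):
        """Install a line, evicting way 0 if the set is full (naive policy)."""
        set_idx = addr % NUM_SETS
        tag = addr // NUM_SETS
        ways = self.sets[set_idx]
        for i, way in enumerate(ways):
            if way is None or way[0] == tag:
                ways[i] = (tag, data)
                return
        ways[0] = (tag, data)         # naive eviction for illustration

cache = SetAssocCache()
cache.fill(0x40, "lineA")
print(cache.lookup(0x40))  # hit -> lineA
print(cache.lookup(0x44))  # miss -> None
```

The performance argument in the abstract follows from where this loop runs: executing the tag compare near the data inside the subarray avoids transferring candidate tags and ways across the DRAM interface on every access.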