A Tensor Algebra Compiler for Sparse Differentiation

Detailed bibliography
Published in: Proceedings / International Symposium on Code Generation and Optimization, pp. 1-12
Main authors: Shaikhha, Amir; Huot, Mathieu; Hashemian, Shideh
Format: Conference paper
Language: English
Published: IEEE, 02.03.2024
ISSN: 2643-2838
Online access: Get full text
Description
Summary: Sparse tensors are prevalent in many data-intensive applications. However, existing automatic differentiation (AD) frameworks are tailored towards dense tensors, which makes it a challenge to efficiently compute gradients through sparse tensor operations. This is due to irregular sparsity patterns that can result in substantial memory and computational overheads. We propose a novel framework that enables the efficient AD of sparse tensors. The key aspects of our work include a compilation pipeline leveraging two intermediate DSLs with AD-agnostic domain-specific optimizations followed by efficient C++ code generation. We showcase the effectiveness of our framework in terms of performance and scalability through extensive experimentation, outperforming state-of-the-art alternatives across a variety of synthetic and real-world datasets.
DOI: 10.1109/CGO57630.2024.10444787
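
Illustrative sketch: the abstract describes differentiating through sparse tensor operations without incurring the memory and compute overhead of dense gradients. The following self-contained C++ example is not the paper's compiler or its generated code; it is a minimal hand-written illustration of the underlying idea, showing a CSR sparse matrix-vector product together with a reverse-mode backward pass that reuses the sparsity pattern of the matrix rather than materializing a dense Jacobian.

// Minimal illustrative sketch (not the paper's framework or its generated code):
// reverse-mode gradient of a CSR sparse matrix-vector product y = A*x.
// The backward pass reuses the CSR structure of A instead of densifying.
#include <cstdio>
#include <vector>

struct CSR {
    int rows, cols;
    std::vector<int> rowptr, colidx;  // CSR index arrays
    std::vector<double> vals;         // nonzero values
};

// Forward pass: y = A * x
std::vector<double> spmv(const CSR& A, const std::vector<double>& x) {
    std::vector<double> y(A.rows, 0.0);
    for (int i = 0; i < A.rows; ++i)
        for (int k = A.rowptr[i]; k < A.rowptr[i + 1]; ++k)
            y[i] += A.vals[k] * x[A.colidx[k]];
    return y;
}

// Backward pass: given dL/dy, accumulate dL/dx = A^T * dL/dy and
// dL/dA restricted to the nonzero pattern of A (dA[k] = dy[i] * x[j]).
void spmv_grad(const CSR& A, const std::vector<double>& x,
               const std::vector<double>& dy,
               std::vector<double>& dx, std::vector<double>& dA) {
    dx.assign(A.cols, 0.0);
    dA.assign(A.vals.size(), 0.0);
    for (int i = 0; i < A.rows; ++i)
        for (int k = A.rowptr[i]; k < A.rowptr[i + 1]; ++k) {
            dx[A.colidx[k]] += A.vals[k] * dy[i];
            dA[k] = dy[i] * x[A.colidx[k]];
        }
}

int main() {
    // 2x3 matrix [[1, 0, 2], [0, 3, 0]] stored in CSR form
    CSR A{2, 3, {0, 2, 3}, {0, 2, 1}, {1.0, 2.0, 3.0}};
    std::vector<double> x{1.0, 2.0, 3.0};
    std::vector<double> y = spmv(A, x);   // y = [7, 6]
    std::vector<double> dy{1.0, 1.0};     // seed: dL/dy
    std::vector<double> dx, dA;
    spmv_grad(A, x, dy, dx, dA);
    printf("dx = [%g, %g, %g]\n", dx[0], dx[1], dx[2]);  // dx = [1, 3, 2]
    return 0;
}

Both passes iterate only over the nonzeros of A, so memory and work scale with the number of nonzeros rather than with the dense shape. Derivative code with this kind of structure-exploiting behavior is what the compilation pipeline described in the abstract aims to produce automatically, rather than by hand as above.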