An Efficient Training Accelerator for Transformers With Hardware-Algorithm Co-Optimization

Transformers have achieved significant success in deep learning, and training Transformers efficiently on resource-constrained platforms has been attracting continuous attention for domain adaptation and privacy concerns. However, deploying Transformer training on these platforms is still challenging...

Bibliographic Details
Published in: IEEE Transactions on Very Large Scale Integration (VLSI) Systems, Vol. 31, No. 11, pp. 1788-1801
Main Authors: Shao, Haikuo; Lu, Jinming; Wang, Meiqi; Wang, Zhongfeng
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.11.2023
ISSN:1063-8210, 1557-9999
Online Access: Full text