SPT-Code: Sequence-to-Sequence Pre-Training for Learning Source Code Representations

Recent years have seen the successful application of large pre-trained models to code representation learning, resulting in substantial improvements on many code-related downstream tasks. However, there are issues surrounding their application to SE tasks. First, the majority of the pre-trained models foc...

Bibliographic Details
Published in: 2022 IEEE/ACM 44th International Conference on Software Engineering (ICSE), pp. 1-13
Main Authors: Niu, Changan; Li, Chuanyi; Ng, Vincent; Ge, Jidong; Huang, Liguo; Luo, Bin
Format: Conference Paper
Language: English
Published: ACM, 01.05.2022
ISSN: 1558-1225