SPT-Code: Sequence-to-Sequence Pre-Training for Learning Source Code Representations
Recent years have seen the successful application of large pre-trained models to code representation learning, resulting in substantial improvements on many code-related downstream tasks. But there are issues surrounding their application to SE tasks. First, the majority of the pre-trained models foc...
| Published in: | 2022 IEEE/ACM 44th International Conference on Software Engineering (ICSE), pp. 01-13 |
|---|---|
| Main authors: | , , , , , |
| Format: | Conference paper |
| Language: | English |
| Published: | ACM, 01.05.2022 |
| ISSN: | 1558-1225 |
| Online access: | Full text |