SPT-Code: Sequence-to-Sequence Pre-Training for Learning Source Code Representations
Recent years have seen the successful application of large pre-trained models to code representation learning, resulting in substantial improvements on many code-related downstream tasks. But there are issues surrounding their application to SE tasks. First, the majority of the pre-trained models focus on …
| Published in: | 2022 IEEE/ACM 44th International Conference on Software Engineering (ICSE), pp. 1-13 |
|---|---|
| Main Authors: | Changan Niu, Chuanyi Li, Vincent Ng, Jidong Ge, Liguo Huang, Bin Luo |
| Format: | Conference Proceeding |
| Language: | English |
| Published: | ACM, 01.05.2022 |
| ISSN: | 1558-1225 |