Bibliographic Details
| Title: | Improving code completion efficiency through grouped attention |
| Authors: | Yin, Yiming; Liu, Jianxun; Liu, Yi; Deng, Jia |
| Source: | Computer Journal; Feb 2026, Vol. 69, Issue 2, p320-331, 12p |
| Keywords: | TRANSFORMER models; COMPUTATIONAL complexity; JAVASCRIPT programming language; PYTHON programming language; MACHINE learning |
| Abstract: | Although code completion models based on the Transformer architecture have achieved remarkable results, the computational complexity of their multi-head self-attention grows quadratically with sequence length, resulting in low efficiency. To address this, we propose a code completion model based on grouped attention, referred to as GACC. This method groups the attention heads of one representation (query, key, or value) so that the heads within a group share a single attention head of the other representations, thereby reducing both the number of parameters and the computational complexity. We conducted experiments on public Python and JavaScript datasets, and the results show that, compared to models based on multi-head self-attention, GACC effectively reduces the time the model takes to suggest the next code token while achieving comparable performance on Top-k and MRR metrics. [ABSTRACT FROM AUTHOR] |
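The abstract does not specify GACC's exact grouping layout, but the description (heads in a group sharing a single head of the other representations) matches the grouped-query-attention pattern. Below is a minimal, illustrative NumPy sketch under that assumption: several query heads share one key/value head, so the K and V projections shrink from `n_q_heads * d_head` to `n_kv_heads * d_head` columns. All weight matrices, names, and dimensions here are placeholders, not the authors' implementation.

```python
import numpy as np

def grouped_attention(x, n_q_heads=8, n_kv_heads=2, d_head=16, seed=0):
    """Illustrative grouped attention: n_q_heads query heads are split
    into n_kv_heads groups, and each group shares one key/value head.
    Projection weights are random placeholders."""
    rng = np.random.default_rng(seed)
    seq, d_model = x.shape
    group = n_q_heads // n_kv_heads  # query heads per shared k/v head
    # K/V projections have n_kv_heads * d_head columns instead of
    # n_q_heads * d_head -- this is where parameters are saved.
    Wq = rng.standard_normal((d_model, n_q_heads * d_head)) / np.sqrt(d_model)
    Wk = rng.standard_normal((d_model, n_kv_heads * d_head)) / np.sqrt(d_model)
    Wv = rng.standard_normal((d_model, n_kv_heads * d_head)) / np.sqrt(d_model)
    q = (x @ Wq).reshape(seq, n_q_heads, d_head)
    k = (x @ Wk).reshape(seq, n_kv_heads, d_head)
    v = (x @ Wv).reshape(seq, n_kv_heads, d_head)
    outs = []
    for h in range(n_q_heads):
        g = h // group  # index of the k/v head shared by this group
        scores = q[:, h] @ k[:, g].T / np.sqrt(d_head)
        scores -= scores.max(axis=-1, keepdims=True)  # stable softmax
        w = np.exp(scores)
        w /= w.sum(axis=-1, keepdims=True)
        outs.append(w @ v[:, g])
    return np.concatenate(outs, axis=-1)  # (seq, n_q_heads * d_head)

out = grouped_attention(np.ones((4, 32)))
print(out.shape)  # (4, 128)
```

With `n_q_heads=8` and `n_kv_heads=2`, the K and V projections each use a quarter of the parameters of a standard multi-head layer, which is the kind of saving the abstract attributes to GACC.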
Copyright of Computer Journal is the property of Oxford University Press / USA and its content may not be copied or emailed to multiple sites without the copyright holder's express written permission. Additionally, content may not be used with any artificial intelligence tools or machine learning technologies. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.) |
| Database: | Complementary Index |