Smart Prompt Advisor: Multi-Objective Prompt Framework for Consistency and Best Practices
Saved in:
| Published in: | IEEE/ACM International Conference on Automated Software Engineering: [proceedings], pp. 1846-1848 |
|---|---|
| Main Authors: | , , , , , |
| Format: | Conference paper |
| Language: | English |
| Published: | IEEE, 11.09.2023 |
| Subjects: | |
| ISSN: | 2643-1572 |
| Online Access: | Get full text |
| Summary: | Recent breakthroughs in Large Language Models (LLMs), comprising billions of parameters, have demonstrated exceptional capability across a wide range of Natural Language Processing (NLP) tasks. The performance of these models hinges on the sophistication and completeness of the input prompt. Minimizing the number of prompt-enhancement cycles with improvised keywords is critically important, as it directly affects the time to market and the cost of the solution being developed. However, this process inevitably involves a trade-off between the user's learning curve and proficiency and the completeness of the prompt, since generating such a solution is an incremental process. In this paper, we design a novel solution, implemented as a plugin for the Visual Studio Code IDE, that optimizes this trade-off by learning the underlying prompt intent and enhancing it with keywords. The enhanced prompt aligns with developers' semantics for writing secure code, ensuring that parameter and local variable names, return expressions, simple pre- and post-conditions, and basic control and data flow are met. |
|---|---|
| DOI: | 10.1109/ASE56229.2023.00019 |
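
The abstract describes enhancing a prompt with keywords selected from its inferred intent. As a rough illustration of that idea only, and not the paper's actual plugin or learned model, the following minimal TypeScript sketch uses a hypothetical `detectIntent` / `enhancePrompt` pair with hand-picked intent labels and keyword lists (all of which are assumptions for demonstration):

```typescript
// Hypothetical sketch of keyword-based prompt enhancement by intent.
// Intent labels, keyword lists, and function names are illustrative
// assumptions, not the plugin's implementation.

type Intent = "secure-coding" | "refactoring" | "general";

// Illustrative keyword sets reflecting the practices the abstract mentions
// (naming, return expressions, pre/post-conditions, control and data flow).
// The real plugin learns such associations from prompt intent.
const intentKeywords: Record<Intent, string[]> = {
  "secure-coding": [
    "validate all parameters",
    "use descriptive local variable names",
    "state pre- and post-conditions",
    "avoid unchecked data flow from user input",
  ],
  refactoring: ["preserve behavior", "keep control flow simple"],
  general: [],
};

// Naive regex-based intent detection, standing in for the learned
// intent model described in the abstract.
function detectIntent(prompt: string): Intent {
  if (/secure|sanitize|vulnerab/i.test(prompt)) return "secure-coding";
  if (/refactor|clean\s*up/i.test(prompt)) return "refactoring";
  return "general";
}

// Append the selected keywords as explicit constraints on the prompt.
function enhancePrompt(prompt: string): string {
  const keywords = intentKeywords[detectIntent(prompt)];
  if (keywords.length === 0) return prompt;
  return `${prompt}\n\nConstraints:\n${keywords.map((k) => `- ${k}`).join("\n")}`;
}

// Example: a terse prompt is expanded with secure-coding constraints.
console.log(enhancePrompt("Write a login handler that stores user passwords securely."));
```

In this toy version the trade-off the abstract describes is visible: the developer writes a short, low-effort prompt, and the enhancement step supplies the completeness (explicit constraints) that would otherwise require several manual refinement cycles.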