Bibliographic Details
| Title: |
Enhancing Text-to-SQL Capabilities of Large Language Models by Intermediate Representations and Chain of Thought. |
| Authors: |
Tingxuan Fu, Yu Fu, Haoran Li, Sijia Hao, Qiming Chen |
| Source: |
International Journal of Multiphysics; 2024, Vol. 18 Issue 4, p345-357, 13p |
| Subject Terms: |
LANGUAGE models, SQL, NATURAL languages, DILEMMA |
| Abstract: |
While large language models with in-context learning have dramatically improved the performance of text-to-SQL tasks, the semantic gap between natural language and SQL queries has not yet been bridged. Although some intermediate representations are designed to reduce the difficulty of SQL query generation, the dilemma of problem decomposition is not effectively alleviated in complex scenarios. Our proposed solution is intended to address both of these issues. First, we use NatSQL as the intermediate representation to implement the task of Text-to-NatSQL. Second, we use samples with Chain-of-Thought information to fine-tune small- and medium-scale LLMs to enhance their task decomposition and reasoning capabilities in complex scenarios. Experimental results demonstrate that our model achieves performance similar to or better than several competitive baselines on the public Spider dataset. [ABSTRACT FROM AUTHOR] |
|
Copyright of International Journal of Multiphysics is the property of MULTIPHYSICS and its content may not be copied or emailed to multiple sites without the copyright holder's express written permission. Additionally, content may not be used with any artificial intelligence tools or machine learning technologies. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.) |
| Database: |
Complementary Index |