Search Results - "SQL (Structured Query Language)"

  2. File Description: application/pdf
     Relation: Conferinţa tehnico-ştiinţifică a studenţilor, masteranzilor şi doctoranzilor = Technical Scientific Conference of Undergraduate, Master and PhD Students: Chişinău, 27-29 March 2024. Vol. 1; http://repository.utm.md/handle/5014/27928

  4. Authors: Castraveț, A.
     Source: Conferinţa tehnico-ştiinţifică a studenţilor, masteranzilor şi doctoranzilor (Vol. 1)
     File Description: application/pdf

  5. Source: RECIMA21 - Revista Científica Multidisciplinar - ISSN 2675-6218; Vol. 4 No. 11 (2023); e4114455
     File Description: application/pdf

  13. File Description: application/pdf
      Relation:
      - Obeng, A., Zhong, J. C., & Gu, C. (2024). How we built text-to-SQL at Pinterest. Pinterest Engineering Blog@Medium. https://medium.com/pinterest-engineering/how-we-built-text-to-sql-at-pinterest-30bad30dabff
      - Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, L., & Polosukhin, I. (2017). Attention is all you need. arXiv preprint arXiv:1706.03762. https://arxiv.org/abs/1706.03762
      - Amidi, A., & Amidi, S. (2024). Super study guide: Transformers & large language models. Afshine Amidi and Shervine Amidi.
      - Bouchard, L.-F., & Peters, L. (2024). Building LLMs for production: Enhancing LLM abilities and reliability with prompting, fine-tuning, and RAG. Towards AI.
      - Huang, L., Yu, W., Ma, W., Zhong, W., Feng, Z., Wang, H., Chen, Q., Peng, W., Feng, X., Qin, B., & Liu, T. (2023). A survey on hallucination in large language models: Principles, taxonomy, challenges, and open questions. arXiv preprint arXiv:2311.05232. https://arxiv.org/abs/2311.05232
      - Lewis, P., Perez, E., Piktus, A., Petroni, F., Karpukhin, V., Goyal, N., Küttler, H., Lewis, M., Yih, W.-t., Rocktäschel, T., Riedel, S., & Kiela, D. (2021). Retrieval-augmented generation for knowledge-intensive NLP tasks. arXiv preprint arXiv:2005.11401. https://arxiv.org/abs/2005.11401
      - Nguyen, Z., Annunziata, A., Luong, V., Dinh, S., Le, Q., Ha, A. H., Le, C., Phan, H. A., Raghavan, S., & Nguyen, C. (2024). Enhancing Q&A with domain-specific fine-tuning and iterative reasoning: A comparative study. arXiv preprint arXiv:2404.11792. https://arxiv.org/abs/2404.11792
      - Auffarth, B. (2023). Generative AI with LangChain: Build large language model (LLM) apps with Python, ChatGPT, and other LLMs. Packt Publishing.
      - Naveed, H., Khan, A. U., Qiu, S., Saqib, M., Anwar, S., Usman, M., Akhtar, N., Barnes, N., & Mian, A. (2024). A comprehensive overview of large language models. arXiv preprint arXiv:2307.06435. https://arxiv.org/abs/2307.06435
      - Alto, V. (2024). Building LLM powered applications: Create intelligent apps and agents with large language models. Packt Publishing.
      - Li, J., Hui, B., Qu, G., Yang, J., Li, B., Li, B., Wang, B., Qin, B., Geng, R., Huo, N., … et al. (2024). Can LLM already serve as a database interface? A big bench for large-scale database grounded text-to-SQLs. Advances in Neural Information Processing Systems, 36. https://arxiv.org/pdf/2305.03111
      - Aluri, R. (2024). Text-to-SQL excellence: An evaluation of Sonnet 3.5 and GPT-4o. Waii@Medium. https://blog.waii.ai/text-to-sql-excellence-an-evaluation-of-sonnet-3-5-and-gpt-4o-c52af5206ffc
      - Amaresh, & Reddy, R. (2024). Hermes: A text-to-SQL solution at Swiggy. Swiggy Bytes Tech Blog@Medium. https://bytes.swiggy.com/hermes-a-text-to-sql-solution-at-swiggy-81573fb4fb6e
      - https://hdl.handle.net/20.500.12495/14054
