Optimizing Academic Queries With Retrieval-Augmented Large Language Models

Authors

  • Ayesha Khaliq
  • Sikander Bakht Abbasi
  • Arslan Ilyas
  • Saim Masood Shaikh
  • Syed Ashar Ali

DOI:

https://doi.org/10.59670/ml.v21is10.11858

Abstract

This research investigates the application of Large Language Models (LLMs) and Retrieval-Augmented Generation (RAG) methods to enhance the management of academic queries, providing advantages for both students and educators. The study aims to improve the precision and relevance of generated answers by combining LLMs with multi-source RAG systems. The model ingests PDF datasets of various sizes and incorporates vector database support to streamline storage and retrieval, thereby boosting its capacity to handle extensive document collections. To process queries and produce comprehensive responses, the research utilizes the LLaMA model family, whose variants range from 7 billion to 65 billion parameters. The study addresses challenges such as textual imperfections in retrieved data, which can significantly degrade the model's output. To assess its robustness, the proposed model is evaluated on a diverse set of academic inquiries. Beyond answering course-related questions, the model also supports international students by providing information on scholarship opportunities and admission guidelines. This research contributes to the advancement of academic research tools by merging retrieval-augmented techniques with sophisticated LLMs, paving the way for future studies in education and generative AI.
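The sketch below illustrates the kind of pipeline the abstract describes: extract text from academic PDFs, embed the chunks, store them in a vector index, and retrieve relevant passages to build a prompt for a LLaMA-family model. The library choices (pypdf, sentence-transformers, FAISS), the file names, and all helper functions are illustrative assumptions; the paper does not specify its exact stack, and the generation step is left as a stub.

```python
# Minimal retrieval-augmented QA sketch (assumptions: pypdf, sentence-transformers,
# FAISS; file names and helper functions are hypothetical, not from the paper).
import numpy as np
import faiss                                            # vector similarity index
from pypdf import PdfReader                             # PDF text extraction
from sentence_transformers import SentenceTransformer   # embedding model


def load_pdf_chunks(path: str, chunk_size: int = 800) -> list[str]:
    """Extract text from a PDF and split it into fixed-size character chunks."""
    text = " ".join((page.extract_text() or "") for page in PdfReader(path).pages)
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]


# 1. Ingest academic PDFs (course handbooks, scholarship and admission guides).
chunks: list[str] = []
for pdf in ["course_catalog.pdf", "scholarships.pdf", "admissions.pdf"]:  # hypothetical files
    chunks.extend(load_pdf_chunks(pdf))

# 2. Embed the chunks and store them in a vector index for retrieval.
embedder = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = embedder.encode(chunks, convert_to_numpy=True).astype(np.float32)
index = faiss.IndexFlatL2(embeddings.shape[1])
index.add(embeddings)


# 3. Retrieve the most relevant chunks for a student query and build a prompt.
def retrieve(query: str, k: int = 4) -> list[str]:
    q = embedder.encode([query], convert_to_numpy=True).astype(np.float32)
    _, ids = index.search(q, k)
    return [chunks[i] for i in ids[0]]


def build_prompt(query: str) -> str:
    context = "\n\n".join(retrieve(query))
    return (f"Answer the question using only the context below.\n\n"
            f"Context:\n{context}\n\nQuestion: {query}\nAnswer:")


# 4. Generate the answer with an LLM; the paper uses LLaMA (7B-65B parameters),
#    but the serving setup is not specified, so this step is stubbed.
def generate_answer(prompt: str) -> str:
    raise NotImplementedError("Plug in a LLaMA-family model of your choice.")


if __name__ == "__main__":
    print(build_prompt("What scholarships are available for international students?"))
```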

Published

2024-05-08

How to Cite

Khaliq, A., Abbasi, S. B., Ilyas, A., Shaikh, S. M., & Ali, S. A. (2024). Optimizing Academic Queries With Retrieval-Augmented Large Language Models. Migration Letters, 21(S10), 1274–1283. https://doi.org/10.59670/ml.v21is10.11858

Issue

Vol. 21 No. S10 (2024)

Section

Special Dossier