Committee Chair

Xie, Mengjun

Committee Member

Sakib, Shahnewaz Karim; Liang, Yu

Department

Dept. of Computer Science and Engineering

College

College of Engineering and Computer Science

Publisher

University of Tennessee at Chattanooga

Place of Publication

Chattanooga (Tenn.)

Abstract

Knowledge Graph Question Answering (KGQA) pipelines commonly depend on separate entity and relation predictors before querying the graph, which introduces engineering complexity and costly inference passes over large vocabularies. This thesis presents a drop-in replacement for those modules: a fine-tuned large language model (LLM) that translates a natural-language question directly into an executable SPARQL query. We fine-tune two instruction-tuned backbones, Llama-3.1-8B-Instruct and Mistral-7B-Instruct, on paired (question, gold SPARQL) examples formatted through chat templates, enabling single-step query generation. The training and inference pipeline includes a lightweight post-processor that corrects tokenizer-induced spacing artifacts in generated SPARQL, improving exact-match robustness without altering query structure. On a held-out test set, the fine-tuned models achieve 97.9% (Llama) and 94.0% (Mistral) exact-match accuracy for natural-language-to-SPARQL generation, demonstrating that an end-to-end translator can meet or exceed the accuracy of typical multi-module KGQA stacks while substantially simplifying the architecture.
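The abstract's two main technical steps can be illustrated briefly. First, formatting (question, gold SPARQL) pairs through a chat template: a minimal sketch using Hugging Face transformers, in which the system prompt and field names are illustrative assumptions rather than the thesis's actual code.

from transformers import AutoTokenizer

# Load the tokenizer for one of the backbones named in the abstract.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.1-8B-Instruct")

def format_example(question: str, gold_sparql: str) -> str:
    # One supervised training pair: the question is the user turn and the
    # gold query is the assistant turn the model learns to reproduce.
    # The system prompt here is an assumption, not the thesis's prompt.
    messages = [
        {"role": "system", "content": "Translate the question into a SPARQL query."},
        {"role": "user", "content": question},
        {"role": "assistant", "content": gold_sparql},
    ]
    # apply_chat_template inserts the model-specific role markers, yielding
    # a single training string for single-step query generation.
    return tokenizer.apply_chat_template(messages, tokenize=False)

Second, the lightweight post-processor. The abstract states only that it corrects tokenizer-induced spacing artifacts without altering query structure; the normalization rules below are an assumed sketch of that behavior, not the thesis's implementation.

import re

def normalize_sparql(query: str) -> str:
    # Collapse runs of whitespace introduced during generation.
    q = " ".join(query.split())
    # Rejoin variables the tokenizer split apart, e.g. "? name" -> "?name".
    q = re.sub(r"\?\s+(\w)", r"?\1", q)
    # Rejoin prefixed names, e.g. "dbo : birthPlace" -> "dbo:birthPlace".
    q = re.sub(r"(\w)\s*:\s*(\w)", r"\1:\2", q)
    return q.strip()

Exact-match accuracy would then be computed over normalized strings, i.e. normalize_sparql(generated) == normalize_sparql(gold).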

Degree

M. S.; A thesis submitted to the faculty of the University of Tennessee at Chattanooga in partial fulfillment of the requirements for the degree of Master of Science.

Date

12-2025

Subject

Question-answering systems; Semantic networks (Information theory); SPARQL (Computer program language)

Keyword

LLM; Knowledge Graph; KGQA; ForensiQ; Fine-tuning; SPARQL

Document Type

Masters theses

DCMI Type

Text

Extent

x, 35 leaves

Language

English

Rights

http://rightsstatements.org/vocab/InC/1.0/

License

http://creativecommons.org/licenses/by/4.0/
