TransKGQA: Enhanced Knowledge Graph Question Answering With Sentence Transformers

Abstract

Knowledge Graph Question Answering (KGQA) plays a crucial role in extracting valuable insights from interconnected information. Existing methods, while commendable, face challenges such as contextual ambiguity and limited adaptability to diverse knowledge domains. This paper introduces TransKGQA, a novel approach addressing these challenges. Leveraging Sentence Transformers, TransKGQA enhances contextual understanding, making it adaptable to various knowledge domains. The model employs question-answer pair augmentation for robustness and introduces a threshold mechanism for reliable answer retrieval. TransKGQA overcomes limitations in existing works by offering a versatile solution for diverse question types. Experimental results, notably with the sentence-transformers/all-MiniLM-L12-v2 model, showcase remarkable performance with an F1 score of 78%. This work advances KGQA systems, contributing to knowledge graph construction, enhanced question answering, and automated Cypher query execution.
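The abstract's threshold mechanism for reliable answer retrieval can be sketched as follows: embed the incoming question, compare it against the embeddings of stored question-answer pairs by cosine similarity, and return an answer only if the best match clears a confidence threshold. This is a minimal illustration, not the paper's implementation; the 3-dimensional vectors and the 0.7 threshold are toy stand-ins for the 384-dimensional embeddings a model such as sentence-transformers/all-MiniLM-L12-v2 would produce.

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve_answer(question_vec, qa_pairs, threshold=0.7):
    """Return (answer, score) for the most similar stored question,
    or (None, score) if the best score falls below the threshold."""
    best_score, best_answer = -1.0, None
    for q_vec, answer in qa_pairs:
        score = cosine_sim(question_vec, q_vec)
        if score > best_score:
            best_score, best_answer = score, answer
    if best_score >= threshold:
        return best_answer, best_score
    return None, best_score  # below threshold: refuse rather than guess

# Toy 3-d embeddings standing in for real sentence-transformer output.
qa_pairs = [
    (np.array([1.0, 0.0, 0.0]), "Answer A"),
    (np.array([0.0, 1.0, 0.0]), "Answer B"),
]
print(retrieve_answer(np.array([0.9, 0.1, 0.0]), qa_pairs))  # confident match: Answer A
print(retrieve_answer(np.array([0.5, 0.5, 0.7]), qa_pairs))  # ambiguous: returns None
```

Refusing to answer below the threshold trades coverage for precision, which matches the abstract's goal of reliable retrieval over always returning the nearest neighbor.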

Citation (APA)

Li Chong, Y., Poo Lee, C., Zen Muhd-Yassin, S., Ming Lim, K., & Kamsani Samingan, A. (2024). TransKGQA: Enhanced Knowledge Graph Question Answering With Sentence Transformers. IEEE Access, 12, 74872–74887. https://doi.org/10.1109/ACCESS.2024.3405583
