UniK-QA: Unified Representations of Structured and Unstructured Knowledge for Open-Domain Question Answering

Abstract

We study open-domain question answering with structured, unstructured, and semi-structured knowledge sources, including text, tables, lists, and knowledge bases. Departing from prior work, we propose a unifying approach that homogenizes all sources by reducing them to text, and applies the retriever-reader model, which has so far been limited to text sources only. Our approach improves results on knowledge-base QA tasks by 11 points compared to the latest graph-based methods. More importantly, we demonstrate that our unified knowledge (UniK-QA) model is a simple yet effective way to combine heterogeneous sources of knowledge, advancing the state-of-the-art results on two popular question answering benchmarks, NaturalQuestions and WebQuestions, by 3.5 and 2.6 points, respectively.
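As a rough illustration of the "reduce everything to text" idea described in the abstract, the sketch below linearizes KB triples and table rows into plain-text passages that an off-the-shelf retriever-reader pipeline (e.g., DPR retrieval with a generative reader) could index alongside ordinary documents. The verbalization templates, function names, and the 100-word packing heuristic are illustrative assumptions, not the paper's exact preprocessing.

```python
# Illustrative sketch (not the paper's exact templates): flatten heterogeneous
# knowledge into plain-text passages so a standard text retriever-reader
# can consume them unchanged.

def linearize_triple(subject, relation, obj):
    """Verbalize a KB triple as a short sentence-like string."""
    # Hypothetical template; the actual verbalization rules may differ.
    return f"{subject} {relation.replace('_', ' ')} {obj}."

def linearize_table_row(title, header, row):
    """Flatten one table row into text, pairing each cell with its column header."""
    cells = ", ".join(f"{col} is {val}" for col, val in zip(header, row))
    return f"{title}: {cells}."

def build_passages(kb_triples, tables, documents, max_words=100):
    """Reduce all sources to text passages of roughly uniform length."""
    texts = [linearize_triple(*t) for t in kb_triples]
    for table in tables:
        texts.extend(
            linearize_table_row(table["title"], table["header"], row)
            for row in table["rows"]
        )
    texts.extend(documents)  # free text is already in the target format

    # Greedily pack linearized units into ~max_words-word passages for retrieval.
    passages, current = [], []
    for text in texts:
        current.append(text)
        if sum(len(p.split()) for p in current) >= max_words:
            passages.append(" ".join(current))
            current = []
    if current:
        passages.append(" ".join(current))
    return passages

if __name__ == "__main__":
    triples = [("Barack Obama", "place_of_birth", "Honolulu")]
    tables = [{"title": "US Presidents",
               "header": ["Name", "Term start"],
               "rows": [["Barack Obama", "2009"]]}]
    docs = ["Honolulu is the capital of Hawaii."]
    print(build_passages(triples, tables, docs))
```

Once all sources share this textual form, the same dense index and reader serve text, tables, and KB facts alike, which is what allows the unified model to combine heterogeneous knowledge without source-specific architectures.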

Cite

APA: Oguz, B., Chen, X., Karpukhin, V., Peshterliev, S., Okhonko, D., Schlichtkrull, M., … Yih, W. T. (2022). UniK-QA: Unified Representations of Structured and Unstructured Knowledge for Open-Domain Question Answering. In Findings of the Association for Computational Linguistics: NAACL 2022 (pp. 1535–1546). Association for Computational Linguistics. https://doi.org/10.18653/v1/2022.findings-naacl.115
