Robust Retrieval Augmented Generation for Zero-shot Slot Filling

Abstract

Automatically inducing high-quality knowledge graphs from a given collection of documents remains a challenging problem in AI. One way to make headway on this problem is through advancements in a related task known as slot filling. In this task, given an entity query in the form of [ENTITY, SLOT, ?], a system is asked to 'fill' the slot by generating or extracting the missing value, exploiting evidence extracted from relevant passage(s) in the given document collection. Recent works in the field try to solve this task in an end-to-end fashion using retrieval-based language models. In this paper, we present a novel approach to zero-shot slot filling that extends dense passage retrieval with hard negatives and robust training procedures for retrieval augmented generation models. Our model reports large improvements on both the T-REx and zsRE slot filling datasets, improving both passage retrieval and slot value generation, and ranking at the top-1 position on the KILT leaderboard. Moreover, we demonstrate the robustness of our system, showing its domain adaptation capability on a new variant of the TACRED dataset for slot filling through a combination of zero/few-shot learning. We release the source code and pre-trained models.
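
To make the task setup concrete, the sketch below (not the authors' implementation; the retriever and reader are toy stand-ins) illustrates the retrieve-then-generate flow for a [ENTITY, SLOT, ?] query: a retriever scores passages against the query, and a reader produces the slot value conditioned on the top evidence.

```python
# Illustrative sketch only: a minimal retrieve-then-generate pipeline for a
# slot-filling query of the form [ENTITY, SLOT, ?]. The "dense" retriever here
# is a hashed bag-of-words toy, and the "generator" simply surfaces the top
# passage as evidence; these are placeholders, not the paper's models.

import math
from collections import Counter

PASSAGES = [
    "Ada Lovelace was born in London in 1815.",
    "Alan Turing worked at Bletchley Park during the war.",
    "London is the capital of the United Kingdom.",
]

def embed(text, dim=64):
    """Toy embedding: hashed bag-of-words, L2-normalised."""
    vec = [0.0] * dim
    for tok, cnt in Counter(text.lower().split()).items():
        vec[hash(tok) % dim] += cnt
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def retrieve(query, passages, k=2):
    """Rank passages by inner product with the query embedding."""
    q = embed(query)
    scored = [(sum(a * b for a, b in zip(q, embed(p))), p) for p in passages]
    return [p for _, p in sorted(scored, reverse=True)[:k]]

def fill_slot(entity, slot, passages):
    """Stand-in for the reader: return the top retrieved passage as evidence;
    a real system would generate the slot value from query + passages."""
    top = passages[0] if passages else ""
    return {"query": [entity, slot, "?"], "evidence": top}

if __name__ == "__main__":
    entity, slot = "Ada Lovelace", "place of birth"
    hits = retrieve(f"{entity} {slot}", PASSAGES)
    print(fill_slot(entity, slot, hits))
```

In the paper's setting, the retriever would instead be a dense passage retrieval bi-encoder trained with hard negatives, and the reader a retrieval augmented generation model that generates the slot value as text.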

Citation (APA)

Glass, M., Rossiello, G., Chowdhury, M. F. M., & Gliozzo, A. (2021). Robust Retrieval Augmented Generation for Zero-shot Slot Filling. In EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings (pp. 1939–1949). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.emnlp-main.148
