Fact Checking Machine Generated Text with Dependency Trees

Abstract

Factual and logical errors made by Natural Language Generation (NLG) systems limit their applicability in many settings. We study this problem in a conversational search and recommendation setting, and observe that we can often make two simplifying assumptions in this domain: (i) there exists a body of structured knowledge we can use for verifying factuality of generated text; and (ii) the text to be factually assessed typically has a well-defined structure and style. Grounded in these assumptions, we propose a fast, unsupervised and explainable technique, DepChecker, that assesses factuality of input text based on rules derived from structured knowledge patterns and dependency relations with respect to the input text. We show that DepChecker outperforms state-of-the-art, general purpose fact-checking techniques in this special, but important case.
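The paper's implementation is not reproduced here, but the abstract's core idea, extracting claims from dependency relations in generated text and checking them against structured knowledge, can be illustrated with a small sketch. Everything below (the toy knowledge base, the `EchoSpeaker` product, the color rule) is an invented assumption for illustration; a real system would use a full dependency parser such as spaCy and a real knowledge source.

```python
# Illustrative sketch (not the authors' implementation): verify claims in
# generated text by matching dependency-derived triples against a
# structured knowledge base of (subject, relation, object) facts.

# Toy knowledge base about a hypothetical product.
KB = {
    ("EchoSpeaker", "color", "black"),
    ("EchoSpeaker", "brand", "Amazon"),
}

def extract_triples(parsed_sentence):
    """Map dependency arcs to KB-style triples.

    `parsed_sentence` is a list of (token, dep, head) tuples standing in
    for real dependency-parser output."""
    subject = attribute = None
    for token, dep, head in parsed_sentence:
        if dep == "nsubj":                # nominal subject of the copula
            subject = token
        elif dep in ("acomp", "attr"):    # predicate adjective / attribute
            attribute = token
    # Simplified rule: treat a copular attribute as a "color" claim when
    # the attribute is a known color term (an assumption for this toy).
    if subject and attribute in {"black", "white", "red"}:
        return [(subject, "color", attribute)]
    return []

def check_factuality(parsed_sentence):
    """Return (triple, is_supported) for each extracted claim."""
    return [(t, t in KB) for t in extract_triples(parsed_sentence)]

# "The EchoSpeaker is black." -> supported by the KB
good = [("The", "det", "EchoSpeaker"),
        ("EchoSpeaker", "nsubj", "is"),
        ("is", "ROOT", "is"),
        ("black", "acomp", "is")]
# "The EchoSpeaker is white." -> contradicts the KB
bad = [("The", "det", "EchoSpeaker"),
       ("EchoSpeaker", "nsubj", "is"),
       ("is", "ROOT", "is"),
       ("white", "acomp", "is")]

print(check_factuality(good))  # [(('EchoSpeaker', 'color', 'black'), True)]
print(check_factuality(bad))   # [(('EchoSpeaker', 'color', 'white'), False)]
```

Because the check is a set-membership test over extracted triples, the result is fast and explainable: an unsupported triple directly names the subject, relation, and value that failed verification, in the spirit of the rule-based approach the abstract describes.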

Citation (APA)

Estes, A., Vedula, N., Collins, M., Cecil, M., & Rokhlenko, O. (2022). Fact Checking Machine Generated Text with Dependency Trees. In EMNLP 2022 - Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing: Industry Track (pp. 468–476). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.emnlp-industry.46
