Regularizing relation representations by first-order implications

Citations: 8
Readers (Mendeley): 92

Abstract

Methods for automated knowledge base construction often rely on trained fixed-length vector representations of relations and entities to predict facts. Recent work showed that such representations can be regularized to inject first-order logic formulae. This makes it possible to incorporate domain knowledge for improved prediction of facts, especially for uncommon relations. However, current approaches rely on propositionalization of formulae and thus do not scale to large sets of formulae or to knowledge bases with many facts. Here we propose a method that imposes first-order constraints directly on relation representations, avoiding the costly grounding of formulae. We show that our approach works well for implications between pairs of relations on artificial datasets.
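
As a rough illustration of what a grounding-free ("lifted") implication constraint on relation representations can look like, the sketch below uses a component-wise hinge penalty that encourages the head relation's embedding to dominate the body relation's embedding; for non-negative entity-pair representations this makes the implication hold for every entity pair without enumerating groundings. The relation names, embedding dimension, and the exact penalty form are illustrative assumptions, not the paper's definitive formulation.

```python
import numpy as np

def implication_penalty(r_body, r_head):
    """Hinge penalty encouraging component-wise r_head >= r_body.

    With non-negative entity-pair representations, this ordering of the
    relation embeddings means any fact scored highly under the body
    relation is scored at least as highly under the head relation,
    i.e. one way to encode body => head without grounding the rule.
    """
    return np.sum(np.maximum(0.0, r_body - r_head))

# Toy example with hypothetical relations and a 4-dimensional embedding.
r_professor_at = np.array([0.9, 0.1, -0.3, 0.5])  # body relation (illustrative)
r_employee_of  = np.array([1.0, 0.4, -0.1, 0.2])  # head relation (illustrative)

# This penalty would be added to the usual factorization training loss.
print(implication_penalty(r_professor_at, r_employee_of))  # 0.3, from the last component
```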

Citation (APA)

Demeester, T., Rocktäschel, T., & Riedel, S. (2016). Regularizing relation representations by first-order implications. In Proceedings of the 5th Workshop on Automated Knowledge Base Construction, AKBC 2016 at the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL-HLT 2016 (pp. 75–80). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/w16-1314
