MILIE: Modular & Iterative Multilingual Open Information Extraction


Abstract

Open Information Extraction (OpenIE) is the task of extracting (subject, predicate, object) triples from natural language sentences. Current OpenIE systems extract all triple slots independently. In contrast, we explore the hypothesis that it may be beneficial to extract triple slots iteratively: first extract the easy slots, then extract the difficult ones by conditioning on the easy slots, and thereby achieve a better overall extraction. Based on this hypothesis, we propose MILIE, a neural OpenIE system that operates in an iterative fashion. Because of its iterative nature, the system is also modular: rule-based extraction systems can be seamlessly integrated with the neural end-to-end system, so that a rule-based system supplies some extraction slots and MILIE leverages them to extract the remaining slots. We confirm our hypothesis empirically: MILIE outperforms SOTA systems on multiple languages ranging from Chinese to Arabic. Additionally, we are the first to provide an OpenIE test dataset for Arabic and Galician.
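The sketch below is a minimal illustration of the iterative idea described in the abstract, not the authors' implementation: slots are filled one at a time, each extraction step sees the slots already filled, and a rule-based system could pre-fill any slot before the loop runs. The per-slot extractor here is a hypothetical, toy stand-in for MILIE's neural components; all function names and the toy heuristics are assumptions made for illustration.

from typing import Dict, Optional

Triple = Dict[str, Optional[str]]

def extract_slot(sentence: str, slot: str, partial: Triple) -> Optional[str]:
    # Hypothetical per-slot extractor. In MILIE this role is played by a neural
    # model that conditions on the sentence plus the slots filled so far; here a
    # toy heuristic is used only so the sketch runs end to end.
    tokens = sentence.rstrip(".").split()
    if slot == "predicate":
        return next((t for t in tokens if t.endswith("s")), None)
    if slot == "subject":
        # Conditioning on the already-extracted predicate.
        return tokens[0] if partial.get("predicate") else None
    if slot == "object":
        pred = partial.get("predicate")
        if pred and pred in tokens:
            rest = tokens[tokens.index(pred) + 1:]
            return " ".join(rest) or None
    return None

def iterative_extract(sentence: str, order=("predicate", "subject", "object")) -> Triple:
    # Fill slots one at a time; each step conditions on the slots already filled.
    # Slots already supplied (e.g. by a rule-based extractor) are skipped.
    triple: Triple = {"subject": None, "predicate": None, "object": None}
    for slot in order:
        if triple[slot] is None:
            triple[slot] = extract_slot(sentence, slot, triple)
    return triple

print(iterative_extract("Berlin hosts the conference."))
# {'subject': 'Berlin', 'predicate': 'hosts', 'object': 'the conference'}

The extraction order is exposed as a parameter because the easy-to-hard ordering is exactly what the paper's hypothesis concerns; which slots count as "easy" is an empirical question the paper investigates, not something fixed by this sketch.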

Citation (APA)

Kotnis, B., Gashteovski, K., Oñoro-Rubio, D., Shaker, A., Rodriguez-Tembras, V., Takamoto, M., … Lawrence, C. (2022). MILIE: Modular & Iterative Multilingual Open Information Extraction. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 6939–6950). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.acl-long.478
