SEE: Syntax-aware entity embedding for neural relation extraction

Abstract

Distantly supervised relation extraction is an efficient approach to scaling relation extraction to very large corpora, and it has been widely used to find novel relational facts in plain text. Recent studies on neural relation extraction have made great progress on this task by modeling sentences in low-dimensional spaces, but they seldom consider syntax information when modeling the entities. In this paper, we propose to learn syntax-aware entity embeddings for neural relation extraction. First, we encode the context of an entity on its dependency tree as a sentence-level entity embedding using a tree-GRU. Then, we apply both intra-sentence and inter-sentence attention to obtain a sentence-set-level entity embedding over all sentences containing the focus entity pair. Finally, we combine the sentence embedding and the entity embedding for relation classification. Experiments on a widely used real-world dataset show that our model makes full use of all informative instances and achieves state-of-the-art relation extraction performance.
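
The abstract names three stages (tree-GRU encoding on the dependency tree, intra-/inter-sentence attention, and combination for classification) without giving their details. The sketch below is an illustrative assumption in PyTorch, not the authors' implementation: TreeGRUEntityEncoder composes a child-sum variant of a GRU bottom-up over a dependency subtree to produce a sentence-level entity embedding, and SentenceSetAttention pools those embeddings across the bag of sentences mentioning the entity pair. The class names, the child-sum composition, and the single learned attention query are all assumptions.

```python
# Hypothetical sketch, not the paper's code: a child-sum tree-GRU that
# encodes an entity's dependency-tree context, plus a simple attention
# aggregator over the sentences containing the focus entity pair.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TreeGRUEntityEncoder(nn.Module):
    """Encodes the subtree around an entity word bottom-up on a dependency tree."""

    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        self.hidden_dim = hidden_dim
        # A GRU cell whose "previous hidden state" is the sum of child states
        # (child-sum composition, assumed here).
        self.cell = nn.GRUCell(input_dim, hidden_dim)

    def forward(self, word_embs: torch.Tensor, children: dict, root: int) -> torch.Tensor:
        # word_embs: (seq_len, input_dim); children: node index -> child indices
        def encode(i: int) -> torch.Tensor:
            child_h = [encode(c) for c in children.get(i, [])]
            h_prev = (torch.stack(child_h).sum(dim=0) if child_h
                      else word_embs.new_zeros(self.hidden_dim))
            # GRUCell expects a batch dimension, so wrap and unwrap a batch of 1.
            return self.cell(word_embs[i].unsqueeze(0), h_prev.unsqueeze(0)).squeeze(0)

        return encode(root)  # sentence-level entity embedding


class SentenceSetAttention(nn.Module):
    """Attention over the bag of sentences mentioning the entity pair."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.query = nn.Parameter(torch.randn(hidden_dim))

    def forward(self, sent_entity_embs: torch.Tensor) -> torch.Tensor:
        # sent_entity_embs: (num_sentences, hidden_dim), one per supporting sentence
        scores = sent_entity_embs @ self.query   # (num_sentences,)
        weights = F.softmax(scores, dim=0)       # inter-sentence attention weights
        return weights @ sent_entity_embs        # sentence-set-level entity embedding


# Toy usage: encode one entity in a 4-word sentence whose dependency
# tree is 2 -> {1, 3}, 1 -> {0}, with the entity word at the root.
encoder = TreeGRUEntityEncoder(input_dim=50, hidden_dim=64)
h = encoder(torch.randn(4, 50), children={2: [1, 3], 1: [0]}, root=2)
```

In the full model, the two set-level entity embeddings (one per entity in the pair) would be combined with a sentence embedding and fed to a softmax classifier over relation labels; the intra-sentence attention mentioned in the abstract would weight words inside each sentence before this pooling step.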

Cite (APA)

He, Z., Chen, W., Li, Z., Zhang, M., Zhang, W., & Zhang, M. (2018). SEE: Syntax-aware entity embedding for neural relation extraction. In 32nd AAAI Conference on Artificial Intelligence, AAAI 2018 (pp. 5795–5802). AAAI Press. https://doi.org/10.1609/aaai.v32i1.12042
