Beyond word attention: Using segment attention in neural relation extraction


Abstract

Relation extraction studies the problem of predicting semantic relations between pairs of entities in sentences. Attention mechanisms are often used in this task to alleviate inner-sentence noise by performing soft selections of words independently. Based on the observation that information pertinent to relations is usually contained within segments (continuous spans of words in a sentence), it is possible to exploit this phenomenon for better extraction. In this paper, we aim to incorporate such segment information into a neural relation extractor. Our approach views the attention mechanism as a linear-chain conditional random field over a set of latent variables whose edges encode the desired structure, and regards the attention weight as the marginal probability of each word being selected as part of the relational expression. Experimental results show that our method can attend to continuous relational expressions without explicit annotations, and achieves state-of-the-art performance on the large-scale TACRED dataset.
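As a rough illustration of the idea described above, the sketch below computes linear-chain CRF marginals with the forward-backward algorithm and uses them as segment-level attention weights over word representations. The binary state space {outside, inside}, the unary/transition scoring inputs, and the pooling step are assumptions made for this example only; they are not the authors' exact formulation.

```python
import numpy as np

def log_sum_exp(a, axis):
    """Numerically stable log-sum-exp along an axis."""
    m = np.max(a, axis=axis, keepdims=True)
    return np.squeeze(m, axis=axis) + np.log(np.sum(np.exp(a - m), axis=axis))

def crf_segment_attention(word_vecs, unary, transition):
    """
    word_vecs : (T, d) word representations from some encoder.
    unary     : (T, 2) per-word scores for states {0: outside, 1: inside segment}.
    transition: (2, 2) transition scores between consecutive latent states.
    Returns the per-word marginals p(z_t = 1 | x) used as attention weights,
    and a pooled sentence-level vector.
    """
    T, S = unary.shape

    # Forward pass: alpha[t, s] = log sum over paths ending in state s at step t.
    alpha = np.zeros((T, S))
    alpha[0] = unary[0]
    for t in range(1, T):
        alpha[t] = log_sum_exp(alpha[t - 1][:, None] + transition, axis=0) + unary[t]

    # Backward pass: beta[t, s] = log sum over path continuations from state s at step t.
    beta = np.zeros((T, S))
    for t in range(T - 2, -1, -1):
        beta[t] = log_sum_exp(
            transition + unary[t + 1][None, :] + beta[t + 1][None, :], axis=1
        )

    log_Z = log_sum_exp(alpha[-1], axis=0)

    # Marginal probability of each word being inside the relational segment.
    marginals = np.exp(alpha[:, 1] + beta[:, 1] - log_Z)

    # Use the marginals as segment-attention weights to pool word vectors
    # (a simple weighted average; the actual pooling choice is an assumption here).
    pooled = (marginals[:, None] * word_vecs).sum(axis=0) / (marginals.sum() + 1e-8)
    return marginals, pooled

# Toy usage with random scores, just to show the call shape.
T, d = 6, 8
rng = np.random.default_rng(0)
weights, sentence_vec = crf_segment_attention(
    rng.normal(size=(T, d)), rng.normal(size=(T, 2)), rng.normal(size=(2, 2))
)
print(weights.round(3), sentence_vec.shape)
```

Because the marginals come from a chain-structured CRF rather than independent per-word softmaxes, neighboring words influence each other's selection probability, which encourages the attention mass to fall on contiguous segments.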

Citation (APA)

Yu, B., Zhang, Z., Liu, T., Wang, B., Li, S., & Li, Q. (2019). Beyond word attention: Using segment attention in neural relation extraction. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2019-August, pp. 5401–5407). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2019/750
