Laprel: A label-aware parallel network for relation extraction

Abstract

Relation extraction is a crucial task in natural language processing (NLP) that aims to extract all relational triples from a given sentence. Extracting overlapping relational triples from complex texts is challenging and has received extensive research attention. Most existing methods are based on cascade models and employ language models to transform the given sentence into vectorized representations. However, the cascaded structure can cause the exposure bias issue, and the vectorized representation of each sentence needs to be closely related to the relation extraction task and its pre-defined relation types. In this paper, we propose a label-aware parallel network (LAPREL) for relation extraction. To avoid exposure bias, we replace the cascade framework with a parallel network based on the table-filling method and a symmetric relation-pair tagger. To obtain task-related sentence embeddings, we embed prior label information into the token embeddings and adjust the sentence embedding for each relation type. The proposed method also handles overlapping relational triples effectively. Extensive experiments on two public datasets compare LAPREL against 10 baselines, and the results show that LAPREL outperforms all of them in extracting relational triples from complex text.
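To make the two ideas in the abstract concrete, the following is a minimal sketch (not the authors' code) of a label-aware, table-filling tagger: a learned per-relation label embedding is fused into the token embeddings, and a symmetric token-pair score table is filled for every pre-defined relation type in parallel. All names here (LabelAwareTagger, hidden_size, num_relations) are illustrative assumptions, not identifiers from the paper.

```python
# Hedged sketch of label-aware table filling; not the published LAPREL implementation.
import torch
import torch.nn as nn

class LabelAwareTagger(nn.Module):
    def __init__(self, hidden_size: int, num_relations: int):
        super().__init__()
        # One learned label embedding per pre-defined relation type.
        self.label_emb = nn.Embedding(num_relations, hidden_size)
        self.proj = nn.Linear(2 * hidden_size, hidden_size)
        # Scores one (head token, tail token) pair; applied to every cell
        # of the relation-specific table (the table-filling step).
        self.pair_scorer = nn.Linear(2 * hidden_size, 1)
        self.num_relations = num_relations

    def forward(self, token_emb: torch.Tensor) -> torch.Tensor:
        # token_emb: (batch, seq_len, hidden) from any encoder, e.g. BERT.
        b, n, h = token_emb.shape
        tables = []
        for r in range(self.num_relations):
            # Inject the prior label information of relation r into every token.
            lab = self.label_emb.weight[r].view(1, 1, h).expand(b, n, h)
            rel_tokens = torch.tanh(self.proj(torch.cat([token_emb, lab], dim=-1)))
            # Build the n x n token-pair table for this relation.
            head = rel_tokens.unsqueeze(2).expand(b, n, n, h)
            tail = rel_tokens.unsqueeze(1).expand(b, n, n, h)
            scores = self.pair_scorer(torch.cat([head, tail], dim=-1)).squeeze(-1)
            # Symmetrize so cell (i, j) equals cell (j, i), mirroring a
            # symmetric relation-pair tagger.
            tables.append((scores + scores.transpose(1, 2)) / 2)
        # (batch, num_relations, seq_len, seq_len): one table per relation,
        # decoded in parallel rather than through a cascade.
        return torch.stack(tables, dim=1)

# Usage: score all token pairs for 4 relation types on a toy batch.
tagger = LabelAwareTagger(hidden_size=64, num_relations=4)
tables = tagger(torch.randn(2, 10, 64))
print(tables.shape)  # torch.Size([2, 4, 10, 10])
```

Because every relation type gets its own table, overlapping triples that share entities or entity pairs can be extracted independently, which is the motivation the abstract gives for preferring a parallel design over a cascade.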

Cite

APA

Li, X., Yang, J., Hu, P., & Liu, H. (2021). Laprel: A label-aware parallel network for relation extraction. Symmetry, 13(6). https://doi.org/10.3390/sym13060961
