Shared representation generator for relation extraction with piecewise-LSTM convolutional neural networks


Abstract

Traditional distant supervision for relation extraction suffers from the problem of introduced noise. In this paper, we present a shared representation generator that de-emphasizes noisy expressions by extracting features common to a relation. Unlike the weighted sum computed by the widely used attention mechanism, we directly generate the bag representation in multi-instance learning through feature transformation, retaining only the semantics relevant to predicting the relation. We introduce a generator loss into the objective function to improve the quality of the shared representation. The structure of the proposed generator is also flexible and scalable. To capture more structural information, the piecewise convolutional neural network (PCNN) is widely used to divide the output of the convolutional layer into three segments, but this approach breaks the consistency and internal relationships of the sentence. To alleviate this issue, we encode the sentence with a piecewise-LSTM convolutional neural network (PLSTM-CNN), which applies a BiLSTM after the pooling layer of the PCNN. Experimental results show that we achieve significant improvement on relation extraction compared with the baselines.
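The piecewise pooling the abstract refers to can be sketched as follows: the convolutional feature map is split into three segments by the positions of the two entity mentions, and each segment is max-pooled separately. This is a minimal illustrative sketch in NumPy; the function name, shapes, and indexing conventions are assumptions, not the authors' implementation.

```python
import numpy as np

def piecewise_max_pool(conv_out, e1_pos, e2_pos):
    """Hypothetical PCNN-style piecewise max pooling.

    conv_out: (n_filters, seq_len) feature map from the conv layer.
    e1_pos, e2_pos: token indices of the two entity mentions (e1_pos < e2_pos).
    Returns a (3 * n_filters,) vector: one max-pooled slice per segment.
    """
    segments = [
        conv_out[:, : e1_pos + 1],             # up to and including entity 1
        conv_out[:, e1_pos + 1 : e2_pos + 1],  # between the two entities
        conv_out[:, e2_pos + 1 :],             # after entity 2
    ]
    # Skip empty segments (e.g. an entity at the sentence boundary).
    pooled = [seg.max(axis=1) for seg in segments if seg.shape[1] > 0]
    return np.concatenate(pooled)

feat = np.arange(24, dtype=float).reshape(4, 6)  # 4 filters, 6 tokens
vec = piecewise_max_pool(feat, e1_pos=1, e2_pos=3)
print(vec.shape)  # (12,) = 3 segments x 4 filters
```

In the PLSTM-CNN described above, these segment-level vectors would then be fed to a BiLSTM rather than used directly, restoring sequential context across the three segments.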

Citation (APA)

Yan, D., & Hu, B. (2019). Shared representation generator for relation extraction with piecewise-LSTM convolutional neural networks. IEEE Access, 7, 31672–31680. https://doi.org/10.1109/ACCESS.2019.2892724
