Using Dilated Residual Network to Model Distantly Supervised Relation Extraction


Abstract

Distantly supervised relation extraction has been widely used to find relational facts in text. However, distant supervision inevitably introduces noise, which can lead to poor contextual representations of relations. In this paper, we propose a deep dilated residual network (DRN) model to address the noise in distantly supervised relation extraction. Specifically, we design a module that employs dilated convolutions in cascade, adopting multiple dilation rates to capture multi-scale context features. By combining these with residual learning, the model is more powerful than a traditional CNN model. Our model significantly improves performance for distantly supervised relation extraction on the large NYT-Freebase dataset compared to various baselines.
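To make the abstract's architecture concrete, the following is a minimal sketch (not the authors' implementation) of a cascade of 1D dilated convolutions with residual connections over a sequence of token features. All function names, shapes, and the depthwise (per-channel) filter form are illustrative assumptions; the actual DRN model uses learned multi-channel filters and additional layers.

```python
import numpy as np

def dilated_conv1d(x, w, dilation):
    """Depthwise 1D dilated convolution with 'same' zero padding.

    x: (T, C) sequence of T token feature vectors with C channels.
    w: (k, C) per-channel filter of kernel size k (assumed odd).
    A dilation rate d makes the filter skip d-1 positions between taps,
    widening the receptive field without adding parameters.
    """
    k, _ = w.shape
    pad = dilation * (k - 1) // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        for j in range(k):
            out[t] += w[j] * xp[t + j * dilation]
    return out

def residual_dilated_block(x, weights, dilations):
    """Cascade of dilated convolutions, each wrapped in a residual connection.

    Stacking increasing dilation rates (e.g. 1, 2, 4) captures multi-scale
    context; the residual addition eases optimization of the deeper stack.
    """
    for w, d in zip(weights, dilations):
        x = np.maximum(dilated_conv1d(x, w, d), 0.0) + x  # ReLU + residual
    return x
```

With an identity filter (center tap 1, others 0), each block simply doubles a non-negative input, which makes the residual pathway easy to verify in isolation.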

Citation (APA)

Zhan, L., Yang, Y., Zhu, P., He, L., & Yu, Z. (2019). Using Dilated Residual Network to Model Distantly Supervised Relation Extraction. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11448 LNCS, pp. 500–504). Springer Verlag. https://doi.org/10.1007/978-3-030-18590-9_75
