Multi-task attention-based neural networks for implicit discourse relationship representation and identification

105 citations · 136 Mendeley readers

Abstract

We present a novel multi-task attention-based neural network model for implicit discourse relationship representation and identification. The model combines two types of representation learning: an attention-based neural network that learns a discourse relationship representation over the two arguments, and a multi-task framework that learns knowledge from both annotated and unannotated corpora. Extensive experiments on two benchmark corpora (the PDTB and CoNLL-2016 datasets) show that our proposed model outperforms state-of-the-art systems.
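To make the first component concrete, the sketch below illustrates the general idea of attention over two discourse arguments: score word pairs across the arguments, derive attention weights for each side, and pool the weighted summaries into a single relation representation. This is a minimal illustration of the technique in general, not the authors' architecture; the function name, pooling scheme, and dimensions are assumptions for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pair_representation(arg1, arg2):
    """Toy cross-argument attention (hypothetical, for illustration).

    arg1: (len1, dim) word vectors of the first argument
    arg2: (len2, dim) word vectors of the second argument
    Returns a (2 * dim,) relation representation.
    """
    scores = arg1 @ arg2.T                 # (len1, len2) word-pair scores
    a1 = softmax(scores.max(axis=1))       # attention weights over arg1 words
    a2 = softmax(scores.max(axis=0))       # attention weights over arg2 words
    v1 = a1 @ arg1                         # attention-weighted summary of arg1
    v2 = a2 @ arg2                         # attention-weighted summary of arg2
    return np.concatenate([v1, v2])        # joint representation of the pair

rng = np.random.default_rng(0)
arg1 = rng.standard_normal((5, 8))   # 5 words, embedding dim 8
arg2 = rng.standard_normal((7, 8))   # 7 words, embedding dim 8
rep = attention_pair_representation(arg1, arg2)
print(rep.shape)  # (16,)
```

In a full model, this representation would be fed to a classifier over discourse relation labels, and the multi-task framework would share the encoder across objectives trained on annotated and unannotated corpora.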

Citation (APA)

Lan, M., Wang, J., Wu, Y., Niu, Z. Y., & Wang, H. (2017). Multi-task attention-based neural networks for implicit discourse relationship representation and identification. In EMNLP 2017 - Conference on Empirical Methods in Natural Language Processing, Proceedings (pp. 1299–1308). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/d17-1134
