Discourse sense classification from scratch using focused RNNs

7 citations, 72 readers (Mendeley)

Abstract

A subtask of the CoNLL 2016 Shared Task focuses on sense classification of multilingual shallow discourse relations. Existing systems rely heavily on external resources, hand-engineered features, patterns, and complex pipelines fine-tuned for the English language. In this paper we describe a different approach and system, inspired by end-to-end training of deep neural networks. Its input consists only of sequences of tokens, which are processed by our novel focused RNNs layer and followed by a dense neural network for classification. The neural networks implicitly learn latent features useful for discourse relation sense classification, which makes the approach almost language-agnostic and independent of prior linguistic knowledge. In the closed-track sense classification task our system achieved an overall F1-measure of 0.5246 on the English blind dataset and a new state-of-the-art F1-measure of 0.7292 on the Chinese blind dataset.
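
To illustrate the pipeline the abstract describes (token sequences passed through a focused RNNs layer, then a dense classifier), here is a minimal sketch in PyTorch. It is only a plausible reading of the abstract, not the authors' implementation: the gating mechanism, layer sizes, sense count, and all names (FocusedRNNsClassifier, gate_rnn, num_focused) are assumptions for illustration.

# Hypothetical sketch of the described pipeline: token sequences -> a
# "focused RNNs" layer (a gating RNN that re-weights the input for several
# downstream RNNs) -> a dense classifier. All details are assumptions.
import torch
import torch.nn as nn

class FocusedRNNsClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=64,
                 num_focused=4, num_senses=15):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Gating RNN: emits one weight per focused RNN at every time step.
        self.gate_rnn = nn.GRU(embed_dim, num_focused, batch_first=True)
        # Several RNNs, each reading the input scaled by its own gate.
        self.focused = nn.ModuleList(
            nn.GRU(embed_dim, hidden_dim, batch_first=True)
            for _ in range(num_focused)
        )
        self.classifier = nn.Sequential(
            nn.Linear(num_focused * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_senses),
        )

    def forward(self, tokens):
        x = self.embed(tokens)                    # (B, T, E)
        gates, _ = self.gate_rnn(x)               # (B, T, num_focused)
        gates = torch.sigmoid(gates)
        states = []
        for i, rnn in enumerate(self.focused):
            weighted = x * gates[:, :, i:i + 1]   # per-focus token weighting
            _, h = rnn(weighted)                  # final hidden state
            states.append(h[-1])                  # (B, hidden_dim)
        return self.classifier(torch.cat(states, dim=-1))

# Usage: classify a batch of two padded token-id sequences.
model = FocusedRNNsClassifier(vocab_size=10000)
tokens = torch.randint(0, 10000, (2, 30))
logits = model(tokens)                            # (2, num_senses)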

Citation (APA)

Weiss, G., & Bajec, M. (2016). Discourse sense classification from scratch using focused RNNs. In Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning: Shared Task, CoNLL 2016 (pp. 50–54). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/k16-2006
