Abstract
This paper proposes adapting self-attention to the discourse level to model discourse elements in argumentative student essays. Specifically, we focus on two issues. First, we propose structural sentence positional encodings to explicitly represent sentence positions. Second, we propose using inter-sentence attention to capture sentence interactions and enhance sentence representations. We conduct experiments on two datasets: a Chinese dataset and an English dataset. We find that (i) sentence positional encodings lead to a large improvement in identifying discourse elements; (ii) a structural relative positional encoding of sentences proves to be the most effective; (iii) inter-sentence attention vectors are useful as a kind of sentence representation for identifying discourse elements.
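The sketch below illustrates the general idea described in the abstract, not the authors' actual model: sentence embeddings attend to each other at the discourse level, a relative sentence-position bias is added to the attention scores, and each sentence's attention row is reused as an extra feature. All module names, dimensions, and the specific form of the positional bias are illustrative assumptions.

```python
# A minimal sketch (not the authors' code) of discourse-level self-attention over
# sentence embeddings, with a learned relative sentence-position bias and the
# per-sentence attention distribution reused as an additional representation.
# All dimensions and names here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DiscourseSelfAttention(nn.Module):
    def __init__(self, d_model=256, max_sents=64):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # One scalar bias per relative sentence offset in [-(max_sents-1), max_sents-1].
        self.rel_bias = nn.Embedding(2 * max_sents - 1, 1)
        self.max_sents = max_sents

    def forward(self, sents):                      # sents: (n_sents, d_model)
        n = sents.size(0)
        pos = torch.arange(n)
        # Relative offsets between every sentence pair, shifted to be non-negative
        # so they can index the embedding table.
        rel = (pos[None, :] - pos[:, None]).clamp(
            -(self.max_sents - 1), self.max_sents - 1) + self.max_sents - 1
        scores = self.q(sents) @ self.k(sents).t() / sents.size(-1) ** 0.5
        scores = scores + self.rel_bias(rel).squeeze(-1)   # positional bias on scores
        attn = F.softmax(scores, dim=-1)           # (n_sents, n_sents)
        context = attn @ self.v(sents)             # attention-weighted sentence mix
        # Concatenate the contextualized vector with the raw attention row; the
        # abstract suggests such inter-sentence attention vectors are themselves
        # a useful sentence representation (in practice the row would be padded
        # to a fixed length across essays).
        return torch.cat([context, attn], dim=-1)  # (n_sents, d_model + n_sents)

# Usage: ten sentences of a hypothetical essay, each a 256-dim embedding.
essay = torch.randn(10, 256)
features = DiscourseSelfAttention()(essay)
print(features.shape)                              # torch.Size([10, 266])
```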
Song, W., Song, Z., Fu, R., Liu, L., Cheng, M., & Liu, T. (2020). Discourse self-attention for discourse element identification in argumentative student essays. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020) (pp. 2820–2830). Association for Computational Linguistics. https://doi.org/10.18653/v1/2020.emnlp-main.225