For multi-turn dialogue rewriting, the ability to effectively model the linguistic knowledge in the dialogue context while filtering out noise is essential for good performance. Existing attentive models attend to all words without any prior focus, which can result in attention being wasted on dispensable words. In this paper, we propose to use semantic role labeling (SRL), which highlights the core semantic information of who did what to whom, to provide additional guidance for the rewriter model. Experiments show that this information significantly improves a RoBERTa-based model that already outperforms previous state-of-the-art systems.
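As a minimal, hypothetical sketch of the idea described above: SRL labels could be converted into a per-token prior that concentrates attention on the predicate and its core arguments. The BIO tag scheme, role set, and weights below are illustrative assumptions, not the paper's actual mechanism.

```python
# Hypothetical sketch: turn SRL BIO tags into an attention prior that
# favors core semantic roles (who did what to whom) over other tokens.
# Role inventory and weights are illustrative, not from the paper.

CORE_ROLES = {"V", "ARG0", "ARG1", "ARG2"}  # predicate and core arguments

def srl_attention_prior(tokens, bio_tags, core_weight=1.0, other_weight=0.1):
    """Return a normalized per-token prior: high for core-role tokens."""
    prior = []
    for tag in bio_tags:
        # "B-ARG0" / "I-ARG0" -> "ARG0"; the "O" tag stays "O"
        role = tag.split("-", 1)[-1] if tag != "O" else "O"
        prior.append(core_weight if role in CORE_ROLES else other_weight)
    total = sum(prior)
    return [p / total for p in prior]  # normalize to a distribution

tokens = ["I", "really", "like", "the", "new", "phone"]
tags = ["B-ARG0", "O", "B-V", "B-ARG1", "I-ARG1", "I-ARG1"]
prior = srl_attention_prior(tokens, tags)
```

Such a prior could then be mixed into a model's attention distribution so that core-role tokens receive more weight than filler words like "really".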
Xu, K., Tan, H., Song, L., Wu, H., Zhang, H., Song, L., & Yu, D. (2020). Semantic role labeling guided multi-turn dialogue rewriter. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020) (pp. 6632–6639). Association for Computational Linguistics. https://doi.org/10.18653/v1/2020.emnlp-main.537