Online Coreference Resolution for Dialogue Processing: Improving Mention-Linking on Real-Time Conversations

Citations: 0
Readers (Mendeley): 27

Abstract

This paper proposes a setting for coreference resolution with online decoding on actively generated input such as dialogue: at each turn, the model accepts an utterance together with its past context, then identifies mentions in the current utterance along with their referents. A baseline and four incrementally updated models adapted from the mention-linking paradigm are proposed for this setting, addressing different aspects including singleton handling, speaker-grounded encoding, and cross-turn mention contextualization. Our approach is assessed on three datasets: Friends, OntoNotes, and BOLT. Results show that each aspect brings steady improvement, and our best models outperform the baseline by over 10%, yielding an effective system for this setting. Further analysis highlights characteristics of the task, such as the importance of mention recall.
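To make the described setting concrete, the sketch below outlines one plausible shape of such an online, per-turn decoding loop under the mention-linking paradigm. It is not the authors' system: all names (OnlineCorefState, detect_mentions, link_score) are hypothetical, and the learned mention detector and pairwise antecedent scorer are replaced with trivial string heuristics purely for illustration.

```python
# Hedged sketch of an online mention-linking decoder, assuming per-turn
# input (speaker + utterance) and persistent cluster state across turns.
# The heuristics below stand in for learned components.
from dataclasses import dataclass, field

@dataclass
class Mention:
    turn: int      # index of the utterance the mention appears in
    text: str      # surface form of the mention
    speaker: str   # speaker of the utterance (loose speaker grounding)

@dataclass
class OnlineCorefState:
    clusters: list = field(default_factory=list)  # list[list[Mention]]

    def process_turn(self, turn, speaker, utterance):
        """Accept one utterance plus the accumulated past context;
        return (mention, cluster_id) links for the current turn."""
        links = []
        for mention in detect_mentions(turn, speaker, utterance):
            best, best_score = None, 0.0
            for cid, cluster in enumerate(self.clusters):
                score = max(link_score(mention, m) for m in cluster)
                if score > best_score:
                    best, best_score = cid, score
            if best is None:
                # No antecedent found: open a new cluster. Clusters that
                # never receive a second mention remain singletons.
                self.clusters.append([mention])
                best = len(self.clusters) - 1
            else:
                self.clusters[best].append(mention)
            links.append((mention, best))
        return links

def detect_mentions(turn, speaker, utterance):
    # Stand-in detector: treats pronouns and capitalized tokens as mentions.
    pronouns = {"I", "you", "he", "she", "it", "we", "they"}
    return [Mention(turn, tok, speaker) for tok in utterance.split()
            if tok in pronouns or tok[:1].isupper()]

def link_score(mention, antecedent):
    # Stand-in for a learned pairwise antecedent scorer: exact string match.
    return 1.0 if mention.text.lower() == antecedent.text.lower() else 0.0

state = OnlineCorefState()
state.process_turn(0, "Ross", "Rachel called me")
print(state.process_turn(1, "Joey", "Rachel is here"))  # links "Rachel" to cluster 0
```

Keeping all past mentions in the cluster state is what makes the decoding online: each utterance is processed exactly once as it arrives, with no re-decoding of earlier turns.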

Cite

APA

Xu, L., & Choi, J. D. (2022). Online Coreference Resolution for Dialogue Processing: Improving Mention-Linking on Real-Time Conversations. In *SEM 2022 - 11th Joint Conference on Lexical and Computational Semantics, Proceedings of the Conference (pp. 341–347). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.starsem-1.30
