Human conversations can evolve in many different ways, creating challenges for automatic understanding and summarization. Goal-oriented conversations often have meaningful sub-dialogue structure, but it can be highly domain-dependent. This work introduces an unsupervised approach to learning hierarchical conversation structure, including turn and sub-dialogue segment labels, corresponding roughly to dialogue acts and sub-tasks, respectively. The decoded structure is shown to be useful in enhancing neural models of language for three conversation-level understanding tasks. Further, the learned finite-state sub-dialogue network is made interpretable through automatic summarization.
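The abstract describes the approach only at a high level. As a rough, hypothetical illustration of the general idea (not the paper's actual model), the sketch below induces turn-level labels by clustering turn representations and then fits a small HMM whose hidden states play the role of sub-dialogue segments, with its transition matrix acting as a finite-state sub-dialogue network. The TF-IDF features, cluster and state counts, and the toy conversation are all placeholder assumptions.

```python
# Hypothetical sketch of unsupervised hierarchical conversation structure:
# (1) cluster turn representations into dialogue-act-like labels,
# (2) fit a categorical HMM over the label sequence so hidden states
#     correspond to coarse sub-dialogue segments.
# Requires scikit-learn and a recent hmmlearn (with CategoricalHMM).
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
from hmmlearn import hmm

# Toy goal-oriented conversation, one string per turn (placeholder data).
turns = [
    "hi , i need to book a table for tonight",
    "sure , how many people ?",
    "four people at 7 pm please",
    "done . anything else ?",
    "yes , is there parking nearby ?",
    "there is a garage across the street",
    "great , thanks , bye",
    "goodbye !",
]

# 1) Turn representations; the paper uses stronger neural encoders,
#    TF-IDF just keeps this sketch self-contained.
X = TfidfVectorizer().fit_transform(turns).toarray()

# 2) Turn-level labels, loosely analogous to dialogue acts.
n_acts = 4
act_labels = KMeans(n_clusters=n_acts, n_init=10, random_state=0).fit_predict(X)

# 3) Segment-level states: a categorical HMM over the act-label sequence.
#    The learned transition matrix is a small finite-state network over
#    sub-dialogue states (e.g., opening, task, closing).
n_segments = 3
model = hmm.CategoricalHMM(n_components=n_segments, n_iter=100, random_state=0)
obs = act_labels.reshape(-1, 1)
model.fit(obs, lengths=[len(turns)])
segment_states = model.predict(obs)

for turn, act, seg in zip(turns, act_labels, segment_states):
    print(f"seg={seg} act={act} | {turn}")
```

The two-level decoding (turn labels nested inside segment states) mirrors the hierarchy described in the abstract; the decoded labels could then be fed as extra features to downstream conversation-level models, as the paper reports doing for three understanding tasks.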
Citation:
Lu, B. R., Hu, Y., Cheng, H., Smith, N. A., & Ostendorf, M. (2022). Unsupervised Learning of Hierarchical Conversation Structure. In Findings of the Association for Computational Linguistics: EMNLP 2022 (pp. 5686–5699). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.findings-emnlp.415