Decoupled Dialogue Modeling and Semantic Parsing for Multi-Turn Text-to-SQL

Abstract

Recently, Text-to-SQL for multi-turn dialogue has attracted great interest. In this setting, the user input of the current turn is parsed into the corresponding SQL query over the appropriate database, given the full dialogue history. Current approaches mostly employ end-to-end models and consequently face two challenges. First, dialogue history modeling and Text-to-SQL parsing are implicitly combined, so it is hard to carry out interpretable analysis and obtain targeted improvements. Second, SQL annotation of multi-turn dialogue is very expensive, leading to training data sparsity. In this paper, we propose a novel decoupled multi-turn Text-to-SQL framework, where an utterance rewrite model first explicitly completes the dialogue context, and a single-turn Text-to-SQL parser then follows. A dual learning approach is also proposed for the utterance rewrite model to address the data sparsity problem. Compared with end-to-end approaches, the proposed decoupled method achieves excellent performance without any annotated in-domain data. With just a few annotated rewrite cases, the decoupled method outperforms the released state-of-the-art end-to-end models on both the SParC and CoSQL datasets.
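
As a rough sketch of the decoupled framework described above (not the authors' released code), the approach can be viewed as a two-stage pipeline: an utterance rewrite model turns the context-dependent current turn into a standalone question, which an ordinary single-turn Text-to-SQL parser then maps to SQL. All class and function names below are illustrative placeholders.

```python
# Minimal sketch of the decoupled multi-turn Text-to-SQL pipeline.
# The classes and method names are hypothetical placeholders, not the paper's code.

from typing import List


class UtteranceRewriter:
    """Rewrites the current user turn into a self-contained question,
    using the dialogue history to resolve ellipsis and coreference."""

    def rewrite(self, history: List[str], current_turn: str) -> str:
        # e.g. history = ["Show all students.", "Only those older than 20."]
        #      current_turn = "Order them by name."
        # -> "Show all students older than 20, ordered by name."
        raise NotImplementedError  # placeholder for a trained rewrite model


class SingleTurnParser:
    """Maps one context-independent question plus a database schema to a SQL query."""

    def parse(self, question: str, schema: dict) -> str:
        raise NotImplementedError  # placeholder for any single-turn Text-to-SQL parser


def multi_turn_to_sql(history: List[str], current_turn: str, schema: dict,
                      rewriter: UtteranceRewriter, parser: SingleTurnParser) -> str:
    # Step 1: explicit dialogue-context completion (an interpretable intermediate output).
    standalone_question = rewriter.rewrite(history, current_turn)
    # Step 2: standard single-turn semantic parsing on the rewritten question.
    return parser.parse(standalone_question, schema)
```

Because the two stages are decoupled, the rewrite step can be inspected or improved on its own, and the single-turn parser can be trained or reused independently of multi-turn SQL annotations.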

Cite

APA: Chen, Z., Chen, L., Li, H., Cao, R., Ma, D., Wu, M., & Yu, K. (2021). Decoupled Dialogue Modeling and Semantic Parsing for Multi-Turn Text-to-SQL. In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 (pp. 3063–3074). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-acl.270
