In this paper, we present a graph-based Transformer for semantic parsing. We separate semantic parsing into two steps: 1) a sequence-to-sequence model generates logical form candidates; 2) a graph-based Transformer reranks the candidates. To handle the structure of logical forms, we incorporate graph information into the Transformer and design a cross-candidate verification mechanism that considers all candidates jointly during ranking. Furthermore, we integrate BERT into our model and jointly train the graph-based Transformer and BERT. We evaluate our approach on three semantic parsing benchmarks: ATIS, JOBS, and the Task Oriented semantic Parsing dataset (TOP). Our graph-based reranking model achieves results comparable to state-of-the-art models on ATIS and JOBS, and sets a new state-of-the-art result on TOP.
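As a rough illustration of the generate-then-rerank pipeline described above, the sketch below scores candidate logical forms with a Transformer encoder whose self-attention mask is derived from each candidate's graph structure, which is one simple way to inject graph information. All names here (GraphRerankerSketch, rerank, the adjacency-mask construction) are hypothetical and not the authors' implementation; the paper's graph-based Transformer and cross-candidate verification mechanism are more involved.

```python
# Hypothetical sketch of a generate-then-rerank pipeline; not the authors' code.
import torch
import torch.nn as nn


class GraphRerankerSketch(nn.Module):
    """Scores a candidate logical form with a Transformer encoder whose
    attention is restricted by a graph-derived mask (an illustrative
    stand-in for the paper's graph-based Transformer)."""

    def __init__(self, vocab_size: int, d_model: int = 128, n_heads: int = 4):
        super().__init__()
        self.n_heads = n_heads
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.scorer = nn.Linear(d_model, 1)

    def forward(self, token_ids: torch.Tensor, adjacency: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len); adjacency: (batch, seq_len, seq_len), 1 = edge.
        x = self.embed(token_ids)
        # Block attention between nodes that are not connected; keep self-loops.
        eye = torch.eye(adjacency.size(-1), dtype=torch.bool, device=adjacency.device)
        block = ~(adjacency.bool() | eye.unsqueeze(0))
        # Expand the per-example mask to one mask per attention head, as PyTorch expects.
        block = block.repeat_interleave(self.n_heads, dim=0)
        h = self.encoder(x, mask=block)
        # Mean-pool token states and map to a single ranking score per candidate.
        return self.scorer(h.mean(dim=1)).squeeze(-1)


def rerank(model: GraphRerankerSketch, candidates) -> int:
    """candidates: list of (token_ids, adjacency) pairs produced by a seq2seq generator.
    Returns the index of the highest-scoring candidate."""
    scores = [model(t.unsqueeze(0), a.unsqueeze(0)).item() for t, a in candidates]
    return max(range(len(scores)), key=scores.__getitem__)
```

Note that this sketch scores each candidate independently; the cross-candidate verification described in the paper additionally lets candidates attend to one another before scoring.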
Citation: Shao, B., Gong, Y., Qi, W., Cao, G., Ji, J., & Lin, X. (2020). Graph-based transformer with cross-candidate verification for semantic parsing. In AAAI 2020 - 34th AAAI Conference on Artificial Intelligence (pp. 8807–8814). AAAI Press. https://doi.org/10.1609/aaai.v34i05.6408