Learning dependency-based compositional semantics

Citations: 173
Readers (Mendeley): 827

Abstract

Suppose we want to build a system that answers a natural language question by representing its semantics as a logical form and computing the answer against a structured database of facts. The core part of such a system is the semantic parser that maps questions to logical forms. Semantic parsers are typically trained from examples of questions annotated with their target logical forms, but this type of annotation is expensive. Our goal is instead to learn a semantic parser from question-answer pairs, where the logical form is modeled as a latent variable. We develop a new semantic formalism, dependency-based compositional semantics (DCS), and define a log-linear distribution over DCS logical forms. The model parameters are estimated using a simple procedure that alternates between beam search and numerical optimization. On two standard semantic parsing benchmarks, we show that our system achieves accuracies comparable to those of state-of-the-art systems that do require annotated logical forms. © 2013 Association for Computational Linguistics.
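The training procedure sketched in the abstract (treating the logical form as a latent variable and alternating beam search with a parameter update) can be illustrated with a toy latent-variable log-linear learner. Everything below is an illustrative sketch, not the paper's DCS implementation: the feature map, the tuple representation of "logical forms", and the fixed candidate set all stand in for the paper's compositional machinery.

```python
import math

def features(question, lf):
    # Toy feature map: co-occurrence of question words with symbols
    # in the candidate logical form (represented here as a tuple).
    return {(w, s): 1.0 for w in question.split() for s in lf}

def score(theta, question, lf):
    # Log-linear score: dot product of weights and features.
    return sum(theta.get(f, 0.0) * v for f, v in features(question, lf).items())

def beam_search(theta, question, candidates, beam_size=5):
    # Stand-in for compositional beam search: rank a fixed candidate set
    # by model score and keep the top beam_size logical forms.
    return sorted(candidates, key=lambda lf: -score(theta, question, lf))[:beam_size]

def train(data, candidates, execute, epochs=10, lr=0.1):
    """data: list of (question, answer) pairs; execute(lf) -> denotation.
    Alternates beam search with a gradient step on the log-likelihood of
    producing the correct answer, marginalizing over latent logical forms."""
    theta = {}
    for _ in range(epochs):
        for question, answer in data:
            beam = beam_search(theta, question, candidates)
            exps = [math.exp(score(theta, question, lf)) for lf in beam]
            z = sum(exps)
            p = [e / z for e in exps]  # p(lf | question) over the beam
            # Logical forms on the beam whose denotation matches the answer.
            good = [i for i, lf in enumerate(beam) if execute(lf) == answer]
            if not good:
                continue  # no consistent parse found; skip this example
            zg = sum(p[i] for i in good)
            for i, lf in enumerate(beam):
                # Gradient of log p(answer | question):
                # E[f | correct answer] - E[f] per feature.
                target = (p[i] / zg) if i in good else 0.0
                for f, v in features(question, lf).items():
                    theta[f] = theta.get(f, 0.0) + lr * (target - p[i]) * v
    return theta
```

After training on question-answer pairs alone, beam search under the learned weights ranks answer-consistent logical forms first, which is the supervision-from-denotations setup the abstract describes.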

Cite (APA)
Liang, P., Jordan, M. I., & Klein, D. (2013). Learning dependency-based compositional semantics. Computational Linguistics, 39(2), 389–446. https://doi.org/10.1162/COLI_a_00127
