Open-vocabulary semantic parsing with both distributional statistics and formal knowledge


Abstract

Traditional semantic parsers map language onto compositional, executable queries in a fixed schema. This mapping allows them to effectively leverage the information contained in large, formal knowledge bases (KBs, e.g., Freebase) to answer questions, but it is also fundamentally limiting: these semantic parsers can only assign meaning to language that falls within the KB's manually-produced schema. Recently proposed methods for open-vocabulary semantic parsing overcome this limitation by learning execution models for arbitrary language, essentially using a text corpus as a kind of knowledge base. However, all prior approaches to open-vocabulary semantic parsing replace a formal KB with textual information, making no use of the KB in their models. We show how to combine the disparate representations used by these two approaches, presenting for the first time a semantic parser that (1) produces compositional, executable representations of language, (2) can successfully leverage the information contained in both a formal KB and a large corpus, and (3) is not limited to the schema of the underlying KB. We demonstrate significantly improved performance over state-of-the-art baselines on an open-domain natural language question answering task.

Citation (APA)
Gardner, M., & Krishnamurthy, J. (2017). Open-vocabulary semantic parsing with both distributional statistics and formal knowledge. In 31st AAAI Conference on Artificial Intelligence, AAAI 2017 (pp. 3195–3201). AAAI press. https://doi.org/10.1609/aaai.v31i1.10980
