Weakly Supervised Training of Semantic Parsers


We present a method for training a semantic parser using only a knowledge base and an unlabeled text corpus, without any individually annotated sentences. Our key observation is that multiple forms of weak supervision can be combined to train an accurate semantic parser: semantic supervision from a knowledge base, and syntactic supervision from dependency-parsed sentences. We apply our approach to train a semantic parser that uses 77 relations from Freebase in its knowledge representation. This semantic parser extracts instances of binary relations with state-of-the-art accuracy, while simultaneously recovering much richer semantic structures, such as conjunctions of multiple relations with partially shared arguments. We demonstrate recovery of this richer structure by extracting logical forms from natural language queries against Freebase. On this task, the trained semantic parser achieves 80% precision and 56% recall, despite never having seen an annotated logical form.
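To make "conjunctions of multiple relations with partially shared arguments" concrete, here is a minimal sketch (not the paper's system) of evaluating such a conjunctive logical form against a toy knowledge base of binary relations. The relation names and facts are invented for illustration only.

```python
# Toy knowledge base: binary relations as sets of entity pairs.
# All names below are hypothetical, not from Freebase.
KB = {
    "founded_in": {("Microsoft", "Albuquerque"), ("Amazon", "Seattle")},
    "headquartered_in": {("Microsoft", "Redmond"), ("Amazon", "Seattle")},
}

def satisfy(conjuncts, bindings=None):
    """Enumerate variable bindings that satisfy a conjunction of
    relation literals, e.g. [("founded_in", "x", "y"), ...]."""
    if bindings is None:
        bindings = {}
    if not conjuncts:
        yield dict(bindings)
        return
    rel, a1, a2 = conjuncts[0]
    for e1, e2 in KB[rel]:
        new = dict(bindings)
        consistent = True
        for var, ent in ((a1, e1), (a2, e2)):
            if var in new and new[var] != ent:
                consistent = False
                break
            new[var] = ent
        if consistent:
            yield from satisfy(conjuncts[1:], new)

# A conjunction of two relations sharing both arguments x and y,
# roughly "companies founded in the city they are headquartered in":
query = [("founded_in", "x", "y"), ("headquartered_in", "x", "y")]
answers = sorted(b["x"] for b in satisfy(query))  # → ["Amazon"]
```

The shared variables `x` and `y` are what makes the two literals a single conjunctive query rather than two independent relation extractions; only entities consistent with both relations survive.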


Find this document

  • ISBN: 9781937284435
  • SCOPUS: 2-s2.0-84883409094
  • PUI: 369719762
  • SGR: 84883409094


Authors

  • Jayant Krishnamurthy
  • Tom M. Mitchell
