In conventional supervised training, a model is fit to all the training examples. However, a single monolithic model may not always be the best strategy, as examples can vary widely. In this work, we explore a different learning protocol that treats each example as a unique pseudo-task, reducing the original learning problem to a few-shot meta-learning scenario with the help of a domain-dependent relevance function. When evaluated on the WikiSQL dataset, our approach converges faster and achieves 1.1%-5.4% absolute accuracy gains over its non-meta-learning counterparts.
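To make the protocol concrete, below is a minimal first-order sketch of the learning loop the abstract describes: each training example becomes a pseudo-task, a relevance function retrieves a small support set for it, the model takes one adaptation step on that support set, and the meta-parameters are then updated on the example itself. The cosine-similarity relevance function, the logistic-regression model, and the synthetic data are all illustrative stand-ins (the paper uses a domain-dependent relevance function over WikiSQL and a neural semantic parser), not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def relevance(i, pool_x, k=5):
    """Hypothetical relevance function: cosine similarity over input
    features. The paper's relevance function is domain-dependent; this
    stand-in just retrieves the k nearest neighbours of example i to
    serve as the pseudo-task's support set."""
    sims = pool_x @ pool_x[i]
    sims = sims / (np.linalg.norm(pool_x, axis=1)
                   * np.linalg.norm(pool_x[i]) + 1e-8)
    sims[i] = -np.inf            # the query example is not its own support
    return np.argsort(-sims)[:k]

def grad_logreg(w, X, y):
    """Gradient of the mean logistic loss for a linear model, a toy
    stand-in for the neural semantic parser used in the paper."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y) / len(y)

# Synthetic data standing in for (question, SQL) training pairs.
X = rng.normal(size=(200, 16))
y = (X @ rng.normal(size=16) > 0).astype(float)

w = np.zeros(16)                 # meta-parameters theta
alpha, beta = 0.1, 0.05          # inner / outer learning rates

for step in range(2000):
    i = int(rng.integers(len(X)))        # each example is one pseudo-task
    support = relevance(i, X, k=5)
    # Inner step: adapt theta on the retrieved support set.
    w_task = w - alpha * grad_logreg(w, X[support], y[support])
    # Outer step (first-order approximation): evaluate the adapted
    # parameters on the query example and update theta directly.
    w = w - beta * grad_logreg(w_task, X[i:i + 1], y[i:i + 1])
```

The outer update here uses a first-order approximation (the adapted parameters are plugged straight into the outer gradient); the full MAML-style objective would also differentiate through the inner adaptation step.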
Citation:
Huang, P.-S., Wang, C., Singh, R., Yih, W., & He, X. (2018). Natural language to structured query generation via meta-learning. In Proceedings of NAACL-HLT 2018 (Vol. 2, pp. 732-738). Association for Computational Linguistics. https://doi.org/10.18653/v1/n18-2115