Abstract
Sequential recurrent neural networks have achieved superior performance on language modeling, but they overlook the structural information in natural language. Recent work on structure-aware models has shown promising results on language modeling. However, how to incorporate structural knowledge for corpora without syntactic annotations remains an open problem. In this work, we propose the neural variational language model (NVLM), which enables the sharing of grammar knowledge among different corpora. Experimental results demonstrate the effectiveness of our framework on two popular benchmark datasets. With the help of the shared grammar, our language model converges significantly faster to a lower perplexity on a new training corpus.
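Background note (not part of the original abstract): the abstract does not state the training objective, but neural variational models of this kind are typically trained by maximizing an evidence lower bound (ELBO) on the sentence log-likelihood, with a latent variable z (here, plausibly the latent grammar or parse structure) inferred by an approximate posterior q. A generic form of that bound, given only as background and not as the authors' specific formulation, is:

\log p_\theta(x) \;\ge\; \mathbb{E}_{q_\phi(z \mid x)}\big[\log p_\theta(x \mid z)\big] \;-\; \mathrm{KL}\big(q_\phi(z \mid x) \,\|\, p(z)\big)

Under this reading, the prior over latent structure p(z) is a natural candidate for the component shared across corpora, which would be consistent with the abstract's claim of faster convergence on a new training corpus; consult the paper for the actual formulation.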
Citation
Zhang, Y., & Song, L. (2020). Language modeling with shared grammar. In ACL 2019 - 57th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (pp. 4442–4453). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/p19-1437