Improved differentiable architecture search for language modeling and named entity recognition


Abstract

In this paper, we study differentiable neural architecture search (NAS) methods for natural language processing. In particular, we improve differentiable architecture search by removing the softmax-local constraint. We also apply differentiable NAS to named entity recognition (NER); to our knowledge, this is the first time differentiable NAS methods have been adopted for an NLP task other than language modeling. On both the PTB language modeling and CoNLL-2003 English NER data, our method outperforms strong baselines, and it achieves a new state-of-the-art result on the NER task.
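The abstract's central change concerns how DARTS normalizes its architecture weights. The sketch below is a minimal, hypothetical PyTorch illustration (not the authors' code; MixedNode, CANDIDATE_OPS, and local_softmax are invented names, and the paper's exact formulation may differ): standard DARTS applies a softmax over the candidate operations of each edge separately, whereas dropping that softmax-local constraint normalizes jointly over all (edge, operation) pairs feeding a node.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Candidate operations on each edge; kept trivial for illustration.
CANDIDATE_OPS = [lambda x: x, torch.tanh, torch.relu]

class MixedNode(nn.Module):
    """One node of the search cell, mixing all incoming edges and ops."""

    def __init__(self, num_inputs, local_softmax=True):
        super().__init__()
        self.local_softmax = local_softmax
        # One architecture weight per (incoming edge, candidate op) pair.
        self.alpha = nn.Parameter(1e-3 * torch.randn(num_inputs, len(CANDIDATE_OPS)))

    def forward(self, states):
        # states: list of tensors, one per incoming edge, all the same shape.
        if self.local_softmax:
            # Standard DARTS: softmax over the ops of each edge separately.
            weights = F.softmax(self.alpha, dim=-1)
        else:
            # Without the softmax-local constraint: one softmax over all
            # (edge, op) pairs, so edges compete with each other as well.
            weights = F.softmax(self.alpha.flatten(), dim=0).view_as(self.alpha)
        out = torch.zeros_like(states[0])
        for i, state in enumerate(states):
            for j, op in enumerate(CANDIDATE_OPS):
                out = out + weights[i, j] * op(state)
        return out

# Usage: a node with two predecessor states, batch size 4, width 8.
node = MixedNode(num_inputs=2, local_softmax=False)
states = [torch.randn(4, 8), torch.randn(4, 8)]
print(node(states).shape)  # torch.Size([4, 8])

Under the joint normalization, incoming edges compete directly with one another, so the learned weights can express a preference over edges as well as over operations within an edge.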

Citation (APA)

Jiang, Y., Hu, C., Xiao, T., Zhang, C., & Zhu, J. (2019). Improved differentiable architecture search for language modeling and named entity recognition. In EMNLP-IJCNLP 2019 - 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing, Proceedings of the Conference (pp. 3585–3590). Association for Computational Linguistics. https://doi.org/10.18653/v1/D19-1367
