Coupling distant annotation and adversarial training for cross-domain Chinese word segmentation

15 citations · 100 Mendeley readers

Abstract

Fully supervised neural approaches have achieved significant progress on the task of Chinese word segmentation (CWS). Nevertheless, the performance of supervised models tends to drop dramatically when they are applied to out-of-domain data. This degradation is caused by the distribution gap across domains and the out-of-vocabulary (OOV) problem. To alleviate these two issues simultaneously, this paper proposes to couple distant annotation and adversarial training for cross-domain CWS. For distant annotation, we rethink the essence of “Chinese words” and design an automatic distant annotation mechanism that does not need any supervision or pre-defined dictionaries from the target domain. The approach can effectively discover domain-specific words and distantly annotate raw texts for the target domain. For adversarial training, we develop a sentence-level training procedure that performs noise reduction and makes maximum use of the source-domain information. Experiments on multiple real-world datasets across various domains show the superiority and robustness of our model, which significantly outperforms previous state-of-the-art cross-domain CWS methods.
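The adversarial component described above can be pictured as a standard domain-adversarial setup. The sketch below is an illustrative assumption only, not the authors' exact architecture: a character-level BiLSTM segmenter whose sentence-level representation is also passed, through a gradient reversal layer, to a domain discriminator, so the shared encoder is pushed toward domain-invariant features. Names such as AdversarialSegmenter, GradReverse, and the weight lambd are hypothetical, and the paper's distant annotation module and decoding details are not reproduced here.

    import torch
    import torch.nn as nn

    class GradReverse(torch.autograd.Function):
        """Identity in the forward pass; flips the gradient sign in the backward pass."""

        @staticmethod
        def forward(ctx, x, lambd):
            ctx.lambd = lambd
            return x.view_as(x)

        @staticmethod
        def backward(ctx, grad_output):
            return -ctx.lambd * grad_output, None

    class AdversarialSegmenter(nn.Module):
        """Hypothetical character-level segmenter with a sentence-level domain head."""

        def __init__(self, vocab_size, emb_dim=128, hidden=256, n_tags=4, lambd=0.1):
            super().__init__()
            self.lambd = lambd
            self.emb = nn.Embedding(vocab_size, emb_dim)
            self.encoder = nn.LSTM(emb_dim, hidden // 2, batch_first=True,
                                   bidirectional=True)
            self.tagger = nn.Linear(hidden, n_tags)      # B/M/E/S tag per character
            self.domain_clf = nn.Linear(hidden, 2)       # source vs. target domain

        def forward(self, char_ids):
            h, _ = self.encoder(self.emb(char_ids))      # (batch, seq_len, hidden)
            tag_logits = self.tagger(h)                  # segmentation head
            sent_repr = h.mean(dim=1)                    # sentence-level representation
            rev = GradReverse.apply(sent_repr, self.lambd)
            domain_logits = self.domain_clf(rev)         # adversarial domain head
            return tag_logits, domain_logits

In such a setup, the segmentation loss would be computed on annotated (gold or distantly annotated) sentences, while the domain loss is computed on both source and target sentences; the reversed gradient discourages the shared encoder from encoding domain identity.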

Citation (APA)

Ding, N., Long, D., Xu, G., Zhu, M., Xie, P., Wang, X., & Zheng, H. T. (2020). Coupling distant annotation and adversarial training for cross-domain Chinese word segmentation. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 6662–6671). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.acl-main.595
