Multi-source domain adaptation with mixture of experts

Abstract

We propose a mixture-of-experts approach for unsupervised domain adaptation from multiple sources. The key idea is to explicitly capture the relationship between a target example and different source domains. This relationship, expressed by a point-to-set metric, determines how to combine predictors trained on various domains. The metric is learned in an unsupervised fashion using meta-training. Experimental results on sentiment analysis and part-of-speech tagging demonstrate that our approach consistently outperforms multiple baselines and can robustly handle negative transfer.
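To make the combination mechanism concrete, here is a minimal Python sketch of how a point-to-set metric can weight per-domain experts at prediction time. All names (point_to_set_distance, moe_predict) are illustrative assumptions, not the authors' code: the paper learns the metric via meta-training, whereas this toy version uses a fixed centroid distance so the mixing mechanics are easy to follow.

```python
# Minimal sketch of mixture-of-experts prediction weighted by a
# point-to-set metric. Names and the choice of metric are assumptions
# for illustration; the paper learns the metric via meta-training.
import numpy as np

def point_to_set_distance(x, source_encodings):
    # Stand-in metric: squared Euclidean distance from the target
    # encoding x to the centroid of one source domain's encodings.
    centroid = source_encodings.mean(axis=0)
    return float(np.sum((x - centroid) ** 2))

def moe_predict(x, source_encoding_sets, experts):
    # Softmax over negated distances: source domains that look more
    # like the target example receive larger mixing weights; the output
    # is a convex combination of the experts' class distributions.
    dists = np.array([point_to_set_distance(x, s) for s in source_encoding_sets])
    weights = np.exp(-dists)
    weights /= weights.sum()
    preds = np.stack([expert(x) for expert in experts])  # (n_sources, n_classes)
    return weights @ preds

# Toy usage: three source domains with different centroids and one
# one-hot "expert" per domain; a target point near domain 1 should be
# routed mostly to expert 1.
rng = np.random.default_rng(0)
sources = [rng.normal(loc=i, scale=0.2, size=(50, 4)) for i in range(3)]
experts = [lambda x, i=i: np.eye(3)[i] for i in range(3)]
x = rng.normal(loc=1.0, scale=0.2, size=4)
print(moe_predict(x, sources, experts))  # weight concentrates on expert 1
```

The softmax-over-distances step is where robustness to negative transfer comes from in this sketch: a source domain far from the target example contributes almost nothing to the final prediction.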

Citation (APA)

Guo, J., Shah, D. J., & Barzilay, R. (2018). Multi-source domain adaptation with mixture of experts. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, EMNLP 2018 (pp. 4694–4703). Association for Computational Linguistics. https://doi.org/10.18653/v1/d18-1498
