Towards accurate and robust domain adaptation under noisy environments


Abstract

In non-stationary environments, learning machines usually confront the domain adaptation scenario, where the data distribution changes over time. Previous domain adaptation works have achieved great success in theory and practice. However, they lose robustness in noisy environments, where the labels and features of examples from the source domain become corrupted. In this paper, we report our attempt towards achieving accurate, noise-robust domain adaptation. We first give a theoretical analysis that reveals how harmful noise influences unsupervised domain adaptation. To eliminate the effect of label noise, we propose offline curriculum learning that minimizes a newly defined empirical source risk. To reduce the impact of feature noise, we propose a proxy-distribution-based margin discrepancy. We seamlessly integrate both methods into an adversarial network that performs efficient joint optimization over them, successfully mitigating the negative influence of both data corruption and distribution shift. A series of empirical studies shows that our algorithm remarkably outperforms the state of the art, with over 10% accuracy improvements on some domain adaptation tasks under noisy environments.
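The abstract does not spell out how the curriculum identifies corrupted source labels, so the sketch below is a minimal illustration under an assumption: it uses the standard small-loss heuristic, in which samples with low per-sample training loss are treated as likely clean. The function name select_clean_subset and the keep_ratio parameter are hypothetical, not the paper's API, and the paper's newly defined empirical source risk and the adversarial margin-discrepancy component are not reproduced here.

# Hedged sketch: small-loss curriculum selection of likely-clean source
# samples. Assumes a PyTorch classifier; names are illustrative only.
import torch
import torch.nn.functional as F

def select_clean_subset(model, inputs, labels, keep_ratio=0.7):
    # keep_ratio is a hypothetical curriculum hyperparameter: the fraction
    # of source samples retained as "clean" at this stage.
    model.eval()
    with torch.no_grad():
        logits = model(inputs)
        # Per-sample cross-entropy; corrupted labels tend to incur high loss.
        losses = F.cross_entropy(logits, labels, reduction="none")
    n_keep = max(1, int(keep_ratio * len(losses)))
    idx = torch.argsort(losses)[:n_keep]  # smallest-loss samples first
    return inputs[idx], labels[idx]

In an offline curriculum of this kind, the selected subset would then be used to estimate the source risk for the adaptation step, with keep_ratio tightened or relaxed across training stages.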

Citation (APA)

Han, Z., Gui, X. J., Cui, C., & Yin, Y. (2020). Towards accurate and robust domain adaptation under noisy environments. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2021-January, pp. 2269–2276). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2020/314
