Learning from Noisy Labels with Complementary Loss Functions

28 Citations · 41 Mendeley Readers

Abstract

Recent research reveals that deep neural networks are sensitive to label noise, which leads to poor generalization performance on some tasks. Although various robust loss functions have been proposed to remedy this issue, they suffer from underfitting and are therefore insufficient for learning accurate models. On the other hand, the commonly used Cross Entropy (CE) loss, which performs well in standard supervised learning (with clean supervision), is not robust to label noise. In this paper, we propose a general framework for learning robust deep neural networks with complementary loss functions. In our framework, CE and a robust loss play complementary roles in a joint learning objective, contributing learning sufficiency and robustness respectively. Specifically, we find that by exploiting the memorization effect of neural networks, we can easily filter out a proportion of hard samples and generate reliable pseudo labels for easy samples, thereby reducing the label noise to a quite low level. We then simply learn with CE on the pseudo supervision and with the robust loss on the original noisy supervision. In this procedure, CE guarantees the sufficiency of optimization while the robust loss serves as a supplement. Experimental results on benchmark classification datasets indicate that the proposed method achieves robust and sufficient deep neural network training simultaneously.
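To make the joint objective concrete, below is a minimal PyTorch-style sketch of one plausible instantiation; it is not the authors' released code. The small-loss filtering rule, the keep_ratio parameter, the use of the model's current argmax predictions as pseudo labels, MAE as the robust loss, and the unweighted sum of the two terms are all illustrative assumptions.

import torch
import torch.nn.functional as F

def small_loss_mask(logits, noisy_labels, keep_ratio=0.8):
    # Hypothetical filtering step. The memorization effect suggests that
    # samples a network fits early (i.e., with small loss) are more likely
    # clean, so we keep the keep_ratio fraction with the smallest
    # per-sample CE loss and treat them as "easy".
    losses = F.cross_entropy(logits, noisy_labels, reduction="none")
    k = max(1, int(keep_ratio * losses.numel()))
    threshold = losses.kthvalue(k).values
    return losses <= threshold

def complementary_loss(logits, noisy_labels, easy_mask):
    # CE on pseudo supervision: here the model's own argmax predictions
    # stand in as pseudo labels for the easy samples, and CE drives
    # sufficient optimization on this (mostly clean) subset.
    pseudo_labels = logits.argmax(dim=1)
    ce = F.cross_entropy(logits[easy_mask], pseudo_labels[easy_mask])

    # Robust loss on the original noisy supervision; MAE is one common
    # noise-tolerant choice and acts as the complementary term.
    probs = F.softmax(logits, dim=1)
    one_hot = F.one_hot(noisy_labels, num_classes=logits.size(1)).float()
    mae = (probs - one_hot).abs().sum(dim=1).mean()

    return ce + mae

In practice the mask would be computed from a warm-up model or an earlier epoch rather than the same forward pass, and the two terms could carry a weighting coefficient; both details are abstracted away in this sketch.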

Cite

CITATION STYLE

APA

Wang, D. B., Wen, Y., Pan, L., & Zhang, M. L. (2021). Learning from Noisy Labels with Complementary Loss Functions. In 35th AAAI Conference on Artificial Intelligence, AAAI 2021 (Vol. 11B, pp. 10111–10119). Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v35i11.17213
