Robust Learning by Self-Transition for Handling Noisy Labels


Abstract

Real-world data inevitably contains noisy labels, which induce poor generalization in deep neural networks. It is known that a network typically begins to rapidly memorize false-labeled samples after a certain point of training. To counter this label-noise challenge, we propose a novel self-transitional learning method called MORPH, which automatically switches its learning phase at the transition point, from seeding to evolution. In the seeding phase, the network is updated using all the samples to collect a seed of clean samples. Then, in the evolution phase, the network is updated using only the set of arguably clean samples, which is precisely and continually expanded by the updated network. Thus, MORPH effectively avoids overfitting to false-labeled samples throughout the entire training period. Extensive experiments on five real-world or synthetic benchmark datasets demonstrate substantial improvements over state-of-the-art methods in terms of robustness and efficiency.
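The abstract describes a two-phase training scheme: train on all samples before the transition point (seeding), then train only on an expanding set of arguably clean samples afterward (evolution). The sketch below illustrates that idea in PyTorch under stated assumptions; the transition epoch, the small-loss rule used to pick "clean" samples, and the keep ratio are illustrative placeholders, not the authors' exact MORPH procedure.

```python
# Minimal sketch of a seeding -> evolution training loop (assumed PyTorch setup).
# transition_epoch, keep_ratio, and the small-loss selection are hypothetical
# stand-ins for MORPH's actual transition detection and clean-set expansion.
import torch
import torch.nn.functional as F

def train_two_phase(model, optimizer, loader, num_epochs,
                    transition_epoch, keep_ratio=0.8, device="cpu"):
    model.to(device)
    for epoch in range(num_epochs):
        model.train()
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            logits = model(x)
            per_sample_loss = F.cross_entropy(logits, y, reduction="none")

            if epoch < transition_epoch:
                # Seeding phase: update the network with all samples.
                loss = per_sample_loss.mean()
            else:
                # Evolution phase: update only with the arguably clean subset,
                # approximated here by the small-loss samples in each batch.
                k = max(1, int(keep_ratio * per_sample_loss.numel()))
                clean_idx = torch.topk(-per_sample_loss, k).indices
                loss = per_sample_loss[clean_idx].mean()

            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
```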

Citation (APA)

Song, H., Kim, M., Park, D., Shin, Y., & Lee, J. G. (2021). Robust Learning by Self-Transition for Handling Noisy Labels. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 1490–1500). Association for Computing Machinery. https://doi.org/10.1145/3447548.3467222
