DyGen: Learning from Noisy Labels via Dynamics-Enhanced Generative Modeling

Abstract

Learning from noisy labels is a challenge that arises in many real-world applications where training data can contain incorrect or corrupted labels. When fine-tuning language models with noisy labels, models can easily overfit the label noise, leading to decreased performance. Most existing methods for learning from noisy labels use static input features for denoising, but these methods are limited by the information they can provide on true label distributions and can result in biased or incorrect predictions. In this work, we propose the Dynamics-Enhanced Generative Model (DyGen), which uses dynamic patterns in the embedding space during the fine-tuning process of language models to improve noisy label predictions. DyGen uses the variational auto-encoding framework to infer the posterior distributions of true labels from noisy labels and training dynamics. Additionally, a co-regularization mechanism is used to minimize the impact of potentially noisy labels and priors. DyGen demonstrates an average accuracy improvement of 3.10% on two synthetic noise datasets and 1.48% on three real-world noise datasets compared to the previous state-of-the-art. Extensive experiments and analyses show the effectiveness of each component in DyGen. Our code is available for reproducibility on GitHub.
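To make the abstract's generative-modeling step more concrete, below is a minimal, hedged sketch (not the authors' released code) of a VAE-style denoiser in PyTorch: it encodes training-dynamics features together with the observed noisy label into a Gaussian latent, then decodes both a reconstruction of the noisy label and a distribution over the unobserved true label. All module names, dimensions, and the loss weighting are illustrative assumptions, not DyGen's exact architecture.

```python
# Illustrative sketch of a dynamics-conditioned VAE for label denoising.
# This is an assumption-laden simplification, not the DyGen implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DynamicsVAE(nn.Module):
    def __init__(self, dyn_dim: int, num_classes: int, latent_dim: int = 32):
        super().__init__()
        # Encoder q(z | dynamics, noisy label): concatenates training-dynamics
        # features with the one-hot noisy label and outputs a Gaussian latent.
        self.encoder = nn.Sequential(
            nn.Linear(dyn_dim + num_classes, 128), nn.ReLU(),
        )
        self.mu_head = nn.Linear(128, latent_dim)
        self.logvar_head = nn.Linear(128, latent_dim)
        # Decoder heads: reconstruct the observed noisy label and predict a
        # posterior over the (unobserved) true label from the latent code.
        self.noisy_head = nn.Linear(latent_dim, num_classes)
        self.clean_head = nn.Linear(latent_dim, num_classes)
        self.num_classes = num_classes

    def forward(self, dyn_feats, noisy_labels):
        y_noisy = F.one_hot(noisy_labels, self.num_classes).float()
        h = self.encoder(torch.cat([dyn_feats, y_noisy], dim=-1))
        mu, logvar = self.mu_head(h), self.logvar_head(h)
        # Reparameterization trick: z = mu + sigma * eps.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        return self.noisy_head(z), self.clean_head(z), mu, logvar


def elbo_loss(noisy_logits, noisy_labels, mu, logvar):
    # Standard ELBO pieces: reconstruction of the observed noisy labels plus
    # a Gaussian KL regularizer. In the full method the clean-label head is
    # further constrained (e.g., by classifier priors and co-regularization),
    # which is omitted here.
    recon = F.cross_entropy(noisy_logits, noisy_labels)
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl
```

In this sketch, `dyn_feats` stands for per-example features summarizing how the language model's embeddings or predictions evolve during fine-tuning; after training, `clean_head` provides the inferred true-label distribution used for denoised prediction.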

Cite (APA)

Zhuang, Y., Yu, Y., Kong, L., Chen, X., & Zhang, C. (2023). DyGen: Learning from Noisy Labels via Dynamics-Enhanced Generative Modeling. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 3674–3686). Association for Computing Machinery. https://doi.org/10.1145/3580305.3599318
