A survey of class-imbalanced semi-supervised learning


Abstract

Semi-supervised learning (SSL) can substantially improve the performance of deep neural networks by exploiting unlabeled data when labeled data is scarce. State-of-the-art (SOTA) semi-supervised algorithms implicitly assume that the class distributions of the labeled and unlabeled datasets are balanced, i.e., that every class has the same number of training samples. When the class distribution of the training data is imbalanced, however, these algorithms perform poorly on minority classes. Recent work has proposed several ways to mitigate the degradation of semi-supervised models under class imbalance. In this article, we comprehensively review class-imbalanced semi-supervised learning (CISSL): we introduce the field, realistically evaluate existing CISSL algorithms, and briefly summarize them.
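To make the class-imbalanced setting concrete, the sketch below builds a long-tailed labeled/unlabeled split of the kind commonly used in CISSL benchmarks, with per-class counts decaying exponentially from head to tail classes. The exponential profile and the names (`n_labeled_max`, `gamma_l`, `gamma_u`) are illustrative assumptions for this sketch, not taken from the article.

```python
import numpy as np

def long_tailed_counts(n_max, num_classes, gamma):
    """Per-class counts decaying exponentially from n_max (head class)
    down to n_max / gamma (tail class); gamma is the imbalance ratio."""
    return [int(n_max * gamma ** (-k / (num_classes - 1))) for k in range(num_classes)]

def make_imbalanced_split(labels, n_labeled_max, n_unlabeled_max, gamma_l, gamma_u, seed=0):
    """Subsample a balanced dataset into class-imbalanced labeled and
    unlabeled index sets. `labels` is a 1-D array of integer class ids."""
    rng = np.random.default_rng(seed)
    classes = np.unique(labels)
    n_l = long_tailed_counts(n_labeled_max, len(classes), gamma_l)
    n_u = long_tailed_counts(n_unlabeled_max, len(classes), gamma_u)
    labeled_idx, unlabeled_idx = [], []
    for k, c in enumerate(classes):
        idx = rng.permutation(np.where(labels == c)[0])
        labeled_idx.extend(idx[: n_l[k]])                    # first slice -> labeled set
        unlabeled_idx.extend(idx[n_l[k] : n_l[k] + n_u[k]])  # next slice -> unlabeled pool
    return np.array(labeled_idx), np.array(unlabeled_idx)

# Example: a CIFAR-10-like label vector with 5000 samples per class.
labels = np.repeat(np.arange(10), 5000)
lab, unlab = make_imbalanced_split(labels, n_labeled_max=1500, n_unlabeled_max=3000,
                                   gamma_l=100, gamma_u=100)
print(np.bincount(labels[lab]))    # head class ~1500 labels, tail class ~15
print(np.bincount(labels[unlab]))  # unlabeled pool follows the same long tail
```

Under such a split, the minority classes receive only a handful of labels and pseudo-labels, which is the regime in which standard SSL methods degrade and CISSL methods are evaluated.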

Citation (APA)

Gui, Q., Zhou, H., Guo, N., & Niu, B. (2024). A survey of class-imbalanced semi-supervised learning. Machine Learning, 113(8), 5057–5086. https://doi.org/10.1007/s10994-023-06344-7
