Rank-Aware Negative Training for Semi-Supervised Text Classification


Abstract

Semi-supervised text classification (SSTC) paradigms typically employ the spirit of self-training. The key idea is to train a deep classifier on limited labeled texts and then iteratively predict pseudo-labels for the unlabeled texts to use in further training. However, performance is largely affected by the accuracy of the pseudo-labels, which may not be high in real-world scenarios. This paper presents a Rank-aware Negative Training (RNT) framework that addresses SSTC as learning with noisy labels. To alleviate the noisy information, we adapt a reasoning-with-uncertainty approach to rank the unlabeled texts based on the evidential support they receive from the labeled texts. Moreover, we propose the use of negative training to train RNT, based on the idea that "the input instance does not belong to the complementary label". A complementary label is randomly selected from all labels except the target label. Intuitively, the probability of the true label serving as a complementary label is low, so negative training introduces less noisy information during training and yields better performance on the test data. Finally, we evaluate the proposed solution on various text classification benchmark datasets. Our extensive experiments show that it consistently outperforms the state-of-the-art alternatives in most scenarios and achieves competitive performance in the others. The code of RNT is publicly available on GitHub.
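To make the negative-training idea concrete, the following is a minimal, illustrative PyTorch sketch (not the authors' released code) of a complementary-label loss: for each instance, a complementary label is drawn uniformly at random from all classes except its (possibly noisy) pseudo-label, and the model is penalized for assigning probability to that complementary label. The function name, tensor shapes, and classifier are hypothetical placeholders.

import torch
import torch.nn.functional as F

def negative_training_loss(logits, pseudo_labels):
    # Illustrative complementary-label ("negative training") objective:
    # minimize -log(1 - p(complementary | x)), i.e. train on
    # "the input does NOT belong to the complementary label".
    num_classes = logits.size(1)

    # Draw a complementary label different from each pseudo-label.
    offsets = torch.randint(1, num_classes, pseudo_labels.shape,
                            device=logits.device)
    complementary = (pseudo_labels + offsets) % num_classes

    probs = F.softmax(logits, dim=1)
    p_comp = probs.gather(1, complementary.unsqueeze(1)).squeeze(1)
    return -torch.log(1.0 - p_comp + 1e-7).mean()

# Hypothetical usage: logits from any text classifier, pseudo-labels
# predicted on unlabeled texts by the base classifier.
logits = torch.randn(8, 4, requires_grad=True)   # batch of 8, 4 classes
pseudo_labels = torch.randint(0, 4, (8,))        # noisy pseudo-labels
loss = negative_training_loss(logits, pseudo_labels)
loss.backward()

In RNT, such a loss would be applied to the unlabeled texts ranked as less reliable, so that a wrong pseudo-label only rarely coincides with the sampled complementary label and therefore contributes little noise.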

Citation (APA)

Murtadha, A., Pan, S., Bo, W., Su, J., Cao, X., Zhang, W., & Liu, Y. (2023). Rank-Aware Negative Training for Semi-Supervised Text Classification. Transactions of the Association for Computational Linguistics, 11, 771–786. https://doi.org/10.1162/tacl_a_00574
