Classification with rejection: Scaling generative classifiers with supervised deep infomax


Abstract

Deep InfoMax (DIM) is an unsupervised representation learning framework that maximizes the mutual information between the inputs and the outputs of an encoder while imposing probabilistic constraints on the outputs. In this paper, we propose Supervised Deep InfoMax (SDIM), which introduces supervised probabilistic constraints on the encoder outputs. These supervised constraints are equivalent to a generative classifier on high-level data representations, for which class-conditional log-likelihoods of samples can be evaluated. Unlike other works that build generative classifiers with conditional generative models, SDIM scales to complex datasets and achieves performance comparable to its discriminative counterparts. With SDIM, we can perform classification with rejection: instead of always reporting a class label, SDIM makes a prediction only when a test sample's largest class-conditional log-likelihood surpasses a pre-chosen threshold; otherwise, the sample is deemed out of the data distribution and rejected. Our experiments show that SDIM with this rejection policy can effectively reject illegal inputs, including adversarial examples and out-of-distribution samples.
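The rejection rule described in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: the function name `classify_with_rejection` and the choice of per-class thresholds (e.g., a low percentile of in-distribution training log-likelihoods) are assumptions made for the example.

```python
import numpy as np

def classify_with_rejection(log_likelihoods, thresholds):
    """Classification with rejection on class-conditional log-likelihoods.

    log_likelihoods: array of shape (n_samples, n_classes), where entry
        [i, c] is log p(x_i | y = c) evaluated on the encoder's output.
    thresholds: array of shape (n_classes,), pre-chosen per-class thresholds
        (e.g., a low percentile of in-distribution training log-likelihoods).

    Returns predicted class indices, with -1 denoting a rejected sample.
    """
    best_class = np.argmax(log_likelihoods, axis=1)
    best_ll = log_likelihoods[np.arange(len(best_class)), best_class]
    # Reject a sample when even its largest class-conditional log-likelihood
    # falls below the threshold of the predicted class.
    rejected = best_ll < thresholds[best_class]
    return np.where(rejected, -1, best_class)
```

In this sketch, samples whose largest class-conditional log-likelihood clears the corresponding threshold receive a label as usual; all others are returned as -1, i.e., deemed out of the data distribution and rejected.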

Cite (APA)

Wang, X., & Yiu, S. M. (2020). Classification with rejection: Scaling generative classifiers with supervised deep infomax. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2021-January, pp. 2980–2986). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2020/412
