Convolutional neural networks with compression complexity pooling for out-of-distribution image detection


Abstract

To reliably detect out-of-distribution images with already deployed convolutional neural networks, several recent studies on out-of-distribution detection have tried to define effective confidence scores without retraining the model. Although they have shown promising results, most of them need to tune hyperparameter values using a few out-of-distribution images, which implicitly assumes a specific test distribution and makes them less practical for real-world applications. In this work, we propose a novel out-of-distribution detection method termed MALCOM, which neither uses any out-of-distribution sample nor retrains the model. Inspired by the observation that global average pooling cannot capture the spatial information of feature maps in convolutional neural networks, our method aims to extract informative sequential patterns from the feature maps. To this end, we introduce a similarity metric that focuses on shared patterns between two sequences based on the normalized compression distance. In short, MALCOM uses both the global average and the spatial patterns of feature maps to identify out-of-distribution images accurately.
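The normalized compression distance (NCD) mentioned in the abstract is a generic similarity measure: for a compressor C, NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)), where C(x) is the compressed length of x and xy is the concatenation of the two sequences. The sketch below illustrates this formula on quantized feature maps; the use of zlib as the compressor, the 16-level quantization, the row-major scan order, and the helper names (compressed_size, ncd, feature_map_to_sequence) are illustrative assumptions, not the authors' exact procedure.

```python
import zlib
import numpy as np

def compressed_size(data: bytes) -> int:
    # Length of the zlib-compressed byte string, used as a practical
    # stand-in for the (uncomputable) Kolmogorov complexity.
    return len(zlib.compress(data, level=9))

def ncd(x: bytes, y: bytes) -> float:
    # Normalized compression distance between two byte sequences:
    # small when the sequences share patterns the compressor can exploit.
    cx, cy, cxy = compressed_size(x), compressed_size(y), compressed_size(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

def feature_map_to_sequence(fmap: np.ndarray, n_levels: int = 16) -> bytes:
    # Quantize a single HxW feature map into a byte sequence (row-major scan)
    # so a general-purpose compressor can pick up its spatial patterns.
    lo, hi = fmap.min(), fmap.max()
    scaled = np.zeros_like(fmap) if hi == lo else (fmap - lo) / (hi - lo)
    return np.round(scaled * (n_levels - 1)).astype(np.uint8).tobytes()

# Toy check: a structured map is closer (smaller NCD) to another structured
# map than to random noise, since they share compressible spatial patterns.
rng = np.random.default_rng(0)
structured_a = np.tile(np.linspace(0.0, 1.0, 8), (8, 1))
structured_b = np.tile(np.linspace(0.0, 1.0, 8) ** 2, (8, 1))
noise = rng.random((8, 8))
print(ncd(feature_map_to_sequence(structured_a), feature_map_to_sequence(structured_b)))
print(ncd(feature_map_to_sequence(structured_a), feature_map_to_sequence(noise)))
```

In this spirit, the paper's pooling keeps spatial structure that global average pooling discards: the average of a feature map says nothing about how activations are arranged, whereas a compression-based distance over the flattened map is sensitive to that arrangement.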

Citation (APA)

Yu, S., Lee, D., & Yu, H. (2020). Convolutional neural networks with compression complexity pooling for out-of-distribution image detection. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2021-January, pp. 2435–2441). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2020/337
