Masked Contrastive Learning for Anomaly Detection


Abstract

Detecting anomalies is a fundamental requirement for safety-critical software systems, yet it remains a long-standing problem. Numerous lines of work have been proposed to address it and have demonstrated their effectiveness. In particular, self-supervised learning based methods are attracting interest because of their capability to learn diverse representations without additional labels. Among self-supervised learning approaches, contrastive learning is a framework that has demonstrated its strength in various fields, including anomaly detection. However, the primary objective of contrastive learning is to learn task-agnostic features without any labels, which is not entirely suited to discerning anomalies. In this paper, we propose a task-specific variant of contrastive learning, named masked contrastive learning, that is better suited to anomaly detection. Moreover, we propose a new inference method, dubbed self-ensemble inference, that further boosts performance by leveraging the ability learned through auxiliary self-supervision tasks. By combining our models, we outperform previous state-of-the-art methods by a significant margin on various benchmark datasets.
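
To make the idea of a "masked" contrastive objective concrete, the sketch below shows one plausible class-conditional masking of the standard NT-Xent loss, where pairs sharing a label are excluded from the negatives. This is an illustrative assumption, not the paper's exact formulation; the function name masked_nt_xent, the tensor shapes, and the specific masking scheme are hypothetical.

# Illustrative sketch only: a class-conditional "masked" variant of the
# NT-Xent contrastive loss. The exact masking scheme and the self-ensemble
# inference procedure in the paper may differ.
import torch
import torch.nn.functional as F

def masked_nt_xent(z1, z2, labels, temperature=0.5):
    """z1, z2: (N, D) embeddings of two augmented views; labels: (N,) class ids."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)        # (2N, D), unit-norm
    sim = z @ z.t() / temperature                             # scaled cosine similarities
    n = z1.size(0)
    y = torch.cat([labels, labels], dim=0)                    # (2N,)

    # Positive for each anchor: the other augmented view of the same sample.
    pos_idx = torch.arange(2 * n, device=z.device).roll(n)

    # Mask out self-similarity and (the "masked" part) pairs sharing a class
    # label, so same-class samples are not treated as negatives.
    self_mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    same_class = y.unsqueeze(0) == y.unsqueeze(1)
    neg_mask = ~(self_mask | same_class)

    pos = sim[torch.arange(2 * n, device=z.device), pos_idx]  # (2N,)
    neg = sim.masked_fill(~neg_mask, float('-inf'))

    logits = torch.cat([pos.unsqueeze(1), neg], dim=1)        # positive sits in column 0
    targets = torch.zeros(2 * n, dtype=torch.long, device=z.device)
    return F.cross_entropy(logits, targets)

In practice such a loss would be computed on the projection-head outputs of two augmented views per batch; at test time an anomaly score could be derived from similarity to normal-class representations, though the paper's self-ensemble inference aggregates scores over auxiliary self-supervision tasks in its own way.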

Citation (APA)

Cho, H., Seol, J., & Lee, S. G. (2021). Masked Contrastive Learning for Anomaly Detection. In IJCAI International Joint Conference on Artificial Intelligence (pp. 1434–1441). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2021/198
