Contrastive Quantization with Code Memory for Unsupervised Image Retrieval

Citations: 48 · Mendeley readers: 26

Abstract

High efficiency in computation and storage makes hashing (including binary hashing and quantization) a common strategy in large-scale retrieval systems. To alleviate the reliance on expensive annotations, unsupervised deep hashing has become an important research problem. This paper provides a novel solution to unsupervised deep quantization, namely Contrastive Quantization with Code Memory (MeCoQ). Unlike existing reconstruction-based strategies, we learn unsupervised binary descriptors via contrastive learning, which better captures discriminative visual semantics. In addition, we find that codeword diversity regularization is critical to prevent contrastive learning-based quantization from degenerating. Moreover, we introduce a novel quantization code memory module that boosts contrastive learning with lower feature drift than conventional feature memories. Extensive experiments on benchmark datasets show that MeCoQ outperforms state-of-the-art methods. Code and configurations are publicly released.
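The abstract names three ingredients: contrastive learning over quantized descriptors, a codeword diversity regularizer, and a memory of quantized codes used as contrastive negatives. A minimal NumPy sketch of these ideas is below; it is illustrative only and not the authors' released implementation — the function names, the soft-quantization form, and all hyperparameters are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def soft_quantize(z, codebook, tau=0.1):
    # Differentiable stand-in for quantization: attend over codewords
    # and return the attention-weighted codeword mixture.
    p = softmax(z @ codebook.T / tau)
    return p @ codebook

def codeword_diversity_reg(codebook):
    # Penalize mean off-diagonal cosine similarity between codewords,
    # discouraging the degenerate case where codewords collapse together.
    C = codebook / np.linalg.norm(codebook, axis=1, keepdims=True)
    sim = C @ C.T
    K = sim.shape[0]
    return (sim - np.eye(K)).sum() / (K * (K - 1))

def contrastive_loss_with_memory(q, pos, memory, tau=0.2):
    # InfoNCE-style loss: the query's quantized code should match its
    # positive view, with stored (quantized) memory codes as negatives.
    def norm(v):
        return v / np.linalg.norm(v, axis=-1, keepdims=True)
    q, pos, memory = norm(q), norm(pos), norm(memory)
    logits = np.concatenate([[q @ pos], memory @ q]) / tau
    return -np.log(softmax(logits)[0])
```

Storing quantized codes in the memory (rather than raw features) is what the abstract credits with lower feature drift: codes change more slowly than encoder features, so stale memory entries stay consistent with fresh queries for longer.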

Citation (APA)

Wang, J., Zeng, Z., Chen, B., Dai, T., & Xia, S. T. (2022). Contrastive Quantization with Code Memory for Unsupervised Image Retrieval. In Proceedings of the 36th AAAI Conference on Artificial Intelligence, AAAI 2022 (Vol. 36, pp. 2468–2476). Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v36i3.20147
