Improved Deep Unsupervised Hashing with Fine-grained Semantic Similarity Mining for Multi-Label Image Retrieval

20 citations · 6 Mendeley readers

Abstract

In this paper, we study deep unsupervised hashing, a critical problem in approximate nearest neighbor search. Most recent methods address it either by reconstructing a semantic similarity structure to guide hashing network learning or by contrastive learning of hash codes. However, in multi-label scenarios these methods typically either produce an inaccurate similarity matrix that does not reflect similarity ranking, or violate the underlying assumption of contrastive learning, which limits retrieval performance. To tackle this issue, we propose a novel method termed HAMAN, which mines semantics from a fine-grained view to improve multi-label image retrieval. In particular, we reconstruct the pairwise similarity structure by matching fine-grained patch features produced by a pre-trained neural network, providing reliable guidance for similarity preservation in the hash codes. Moreover, we propose a novel conditional contrastive learning scheme on hash codes that adapts self-supervised learning to multi-label scenarios. Extensive experiments on three multi-label datasets show that the proposed method outperforms a broad range of state-of-the-art methods.
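The abstract does not spell out the matching rule used to build the similarity matrix. As a minimal sketch, assuming patch features come from a frozen pre-trained backbone and that "matching fine-grained patch features" is approximated by a symmetric best-match cosine score, the reconstruction could look like the following (the function names and the averaging rule are illustrative assumptions, not the authors' definitions):

```python
import torch
import torch.nn.functional as F

def patch_matching_similarity(patches_a: torch.Tensor, patches_b: torch.Tensor) -> torch.Tensor:
    """Fine-grained similarity between two images from their patch features.

    patches_a: (P, D) patch features of image A from a pre-trained backbone.
    patches_b: (Q, D) patch features of image B.
    Returns a scalar similarity in [-1, 1].
    """
    a = F.normalize(patches_a, dim=-1)
    b = F.normalize(patches_b, dim=-1)
    sim = a @ b.t()  # (P, Q) pairwise cosine similarities between patches
    # Match every patch in A to its best counterpart in B and vice versa,
    # then average both directions to get a symmetric matching score.
    return 0.5 * (sim.max(dim=1).values.mean() + sim.max(dim=0).values.mean())

def build_similarity_matrix(patch_feats: list[torch.Tensor]) -> torch.Tensor:
    """patch_feats: list of (P_i, D) tensors, one entry per image.

    Returns an (N, N) matrix that could serve as the reconstructed pairwise
    similarity target guiding hash-code learning.
    """
    n = len(patch_feats)
    s = torch.zeros(n, n)
    for i in range(n):
        for j in range(i, n):
            s[i, j] = s[j, i] = patch_matching_similarity(patch_feats[i], patch_feats[j])
    return s
```

Such a patch-level target is finer-grained than a single global-feature cosine, which is consistent with the paper's claim that it better reflects similarity ranking for multi-label images; the exact formulation in the paper may differ.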

Citation (APA)
Ma, Z., Luo, X., Chen, Y., Hou, M., Li, J., Deng, M., & Lu, G. (2022). Improved Deep Unsupervised Hashing with Fine-grained Semantic Similarity Mining for Multi-Label Image Retrieval. In IJCAI International Joint Conference on Artificial Intelligence (pp. 1254–1260). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2022/175
