Adversary guided asymmetric hashing for cross-modal retrieval

Citations of this article: 129
Mendeley readers: 36

Abstract

Cross-modal hashing has attracted considerable attention for large-scale multi-modal retrieval tasks, and many hashing methods have been proposed for cross-modal retrieval. However, these methods pay insufficient attention to the feature learning process and cannot fully preserve the higher ranking correlation of similar item pairs over dissimilar ones, nor the multi-label semantics of each item, so the quality of the binary codes may be degraded. To tackle these problems, we propose a novel deep cross-modal hashing method called Adversary Guided Asymmetric Hashing (AGAH). Specifically, it employs an adversarial-learning-guided multi-label attention module to strengthen the feature learning part, which learns discriminative feature representations while maintaining cross-modal invariance. Furthermore, to generate hash codes that fully preserve the multi-label semantics of all items, we propose an asymmetric hashing method that uses a multi-label binary code map to equip the hash codes with multi-label semantic information. In addition, to ensure that all similar item pairs rank higher in correlation than dissimilar ones, we adopt a new triplet-margin constraint and a cosine quantization technique for Hamming-space similarity preservation. Extensive empirical studies show that AGAH outperforms several state-of-the-art methods for cross-modal retrieval.
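The abstract names two concrete objectives: a triplet-margin constraint that ranks similar cross-modal pairs above dissimilar ones, and a cosine quantization term for Hamming-space preservation. The sketch below is a minimal, hedged illustration of how such terms are commonly formulated, not the paper's actual loss. It assumes the margin is applied to cosine similarities and that quantization aligns continuous codes with their sign vectors; all function names, the margin value, the 0.1 weight, and the code length are illustrative assumptions.

# Hedged sketch (PyTorch): triplet-margin ranking on cosine similarity plus a
# cosine quantization term pulling continuous codes toward {-1, +1} vertices.
# Names and hyperparameters are illustrative, not taken from the AGAH paper.
import torch
import torch.nn.functional as F


def triplet_margin_cosine_loss(anchor, positive, negative, margin=0.5):
    """Rank cosine similarity of (anchor, positive) above (anchor, negative)."""
    sim_pos = F.cosine_similarity(anchor, positive, dim=1)
    sim_neg = F.cosine_similarity(anchor, negative, dim=1)
    return F.relu(margin + sim_neg - sim_pos).mean()


def cosine_quantization_loss(continuous_codes):
    """Encourage each continuous code to align with its own sign vector,
    i.e. with a vertex of the Hamming hypercube {-1, +1}^k."""
    binary_target = torch.sign(continuous_codes.detach())
    sim = F.cosine_similarity(continuous_codes, binary_target, dim=1)
    return (1.0 - sim).mean()


if __name__ == "__main__":
    torch.manual_seed(0)
    k = 64                                              # hash code length (illustrative)
    img_codes = torch.randn(8, k, requires_grad=True)   # stand-in for image network outputs
    txt_pos = torch.randn(8, k)                         # stand-in for similar text codes
    txt_neg = torch.randn(8, k)                         # stand-in for dissimilar text codes

    loss = (triplet_margin_cosine_loss(img_codes, txt_pos, txt_neg)
            + 0.1 * cosine_quantization_loss(img_codes))
    loss.backward()
    print(float(loss))

In a real training loop the anchors, positives, and negatives would come from the image and text networks and the multi-label supervision described in the paper, and the margin and quantization weight would be tuned rather than fixed as above.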

Cited by (powered by Scopus)

Deep Multi-View Enhancement Hashing for Image Retrieval (325 citations, 112 readers)

A Decade Survey of Content Based Image Retrieval Using Deep Learning (201 citations, 267 readers)

Joint-modal Distribution-based Similarity Hashing for Large-scale Unsupervised Deep Cross-modal Retrieval (155 citations, 31 readers)

Citation (APA)

Gu, W., Gu, X., Gu, J., Li, B., Xiong, Z., & Wang, W. (2019). Adversary guided asymmetric hashing for cross-modal retrieval. In ICMR 2019 - Proceedings of the 2019 ACM International Conference on Multimedia Retrieval (pp. 159–167). Association for Computing Machinery, Inc. https://doi.org/10.1145/3323873.3325045

Readers over time

Chart: Mendeley reader counts per year, 2019 to 2025 (0 to 16 readers).

Readers' Seniority

PhD / Post grad / Masters / Doc: 16 (76%)
Lecturer / Post doc: 3 (14%)
Researcher: 2 (10%)

Readers' Discipline

Computer Science: 17 (74%)
Engineering: 5 (22%)
Economics, Econometrics and Finance: 1 (4%)
