Deep-hashing methods have drawn significant attention in recent years in the field of remote sensing (RS) owing to their prominent capability to capture the semantics of complex RS scenes and generate the associated hash codes in an end-to-end manner. Most existing deep-hashing methods exploit pairwise and triplet losses to learn hash codes that preserve semantic similarities, which requires the construction of image pairs and triplets based on supervised information (e.g., class labels). However, the Hamming spaces learned with these losses may not be optimal, owing to an insufficient sampling of image pairs and triplets in scalable RS archives. To overcome this limitation, we propose a new deep-hashing technique based on class-discriminated neighborhood embedding, which can properly capture the locality structure among the RS scenes and discriminate images class-wise in the Hamming space. An extensive experimental evaluation has been conducted to validate the effectiveness of the proposed method by comparing it with several state-of-the-art conventional and deep-hashing methods. The codes related to this article will be made publicly available for reproducible research by the community.
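To make the baseline concrete, the pairwise/triplet losses discussed above can be sketched as follows. This is a minimal, generic triplet formulation over relaxed binary codes, not the paper's proposed class-discriminated neighborhood embedding objective; the function names and the margin value are illustrative assumptions.

```python
import numpy as np

def binarize(z):
    """Map continuous network outputs to binary hash codes in {-1, +1}.

    (Illustrative helper; in practice the sign is applied after
    training a relaxed, differentiable surrogate.)
    """
    return np.where(z >= 0, 1.0, -1.0)

def hamming_triplet_loss(anchor, positive, negative, margin=2.0):
    """Generic triplet loss in the Hamming space (illustrative sketch).

    For codes in {-1, +1}^K, the Hamming distance can be written as
    d_H(a, b) = (K - a.b) / 2. The loss encourages the same-class
    (positive) code to be at least `margin` bits closer to the anchor
    than the different-class (negative) code.
    """
    K = anchor.shape[-1]
    d_pos = (K - np.sum(anchor * positive, axis=-1)) / 2.0
    d_neg = (K - np.sum(anchor * negative, axis=-1)) / 2.0
    # Standard hinge: zero loss once the margin is satisfied.
    return np.maximum(0.0, d_pos - d_neg + margin)
```

As the abstract notes, such losses depend entirely on which pairs and triplets are sampled from the archive, which is the limitation the proposed embedding aims to avoid.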
Kang, J., Fernandez-Beltran, R., Ye, Z., Tong, X., & Plaza, A. (2020). Deep Hashing Based on Class-Discriminated Neighborhood Embedding. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 13, 5998–6007. https://doi.org/10.1109/JSTARS.2020.3027954