Semi-Supervised Metric Learning: A Deep Resurrection

Abstract

Distance Metric Learning (DML) seeks to learn a discriminative embedding in which similar examples lie close together and dissimilar examples lie far apart. In this paper, we address the problem of Semi-Supervised DML (SSDML), which tries to learn a metric from a few labeled examples together with abundantly available unlabeled examples. SSDML is important because it is infeasible to manually annotate all the examples in a large dataset. Surprisingly, with the exception of a few classical approaches that learn a linear Mahalanobis metric, SSDML has not been studied in recent years, and no approaches exist for the deep SSDML scenario. In this paper, we address this challenging problem and revamp SSDML for deep learning. In particular, we propose a stochastic, graph-based approach that first propagates the affinities between pairs of labeled examples to the unlabeled pairs. The propagated affinities are then used to mine triplet-based constraints for metric learning. We impose an orthogonality constraint on the metric parameters, as it leads to better performance by avoiding model collapse.
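The abstract names three ingredients: propagating pairwise affinities over a graph, mining triplets from the propagated affinities, and penalizing non-orthogonal metric parameters. The following is a minimal NumPy sketch of these ideas only; the propagation rule, the threshold-based triplet mining, and all function names are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def propagate_affinities(S, A0, alpha=0.9, iters=20):
    """Illustrative (assumed) propagation rule: repeatedly smooth an
    initial labeled-pair affinity matrix A0 over a row-normalized
    similarity graph S, while anchoring to the labeled affinities."""
    P = S / S.sum(axis=1, keepdims=True)  # row-stochastic transition matrix
    A = A0.copy()
    for _ in range(iters):
        A = alpha * (P @ A @ P.T) + (1 - alpha) * A0
    return A

def mine_triplets(A, thresh_pos=0.7, thresh_neg=0.3):
    """Mine (anchor, positive, negative) index triplets from a
    propagated affinity matrix (thresholds are assumed)."""
    n = A.shape[0]
    triplets = []
    for a in range(n):
        for p in range(n):
            for ng in range(n):
                if a != p and A[a, p] > thresh_pos and A[a, ng] < thresh_neg:
                    triplets.append((a, p, ng))
    return triplets

def orthogonality_penalty(W):
    """||W^T W - I||_F^2: discourages degenerate (collapsed) metric
    parameters by pushing the columns of W toward orthonormality."""
    G = W.T @ W
    return np.sum((G - np.eye(W.shape[1])) ** 2)
```

A metric-learning loss would add `orthogonality_penalty` as a regularizer on top of a standard triplet loss computed over the mined triplets; the propagation step is what lets the unlabeled examples contribute those triplets.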

Citation (APA)

Dutta, U. K., Harandi, M., & Sekhar, C. C. (2021). Semi-Supervised Metric Learning: A Deep Resurrection. In 35th AAAI Conference on Artificial Intelligence, AAAI 2021 (Vol. 8B, pp. 7279–7287). Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v35i8.16894
