Learning Mahalanobis distance metric: Considering instance disturbance helps


Abstract

The Mahalanobis distance metric takes feature weights and correlations into account in the distance computation, which can improve the performance of many similarity/dissimilarity-based methods, such as kNN. Most existing distance metric learning methods derive the metric from raw features and side information but neglect their reliability. Noise or disturbances on instances change their pairwise relationships and thus affect the learned metric. In this paper, we claim that considering the disturbance of instances can help a metric learning approach obtain a robust metric, and propose the Distance metRIc learning Facilitated by disTurbances (Drift) approach. In Drift, the noise or disturbance of each instance is learned. As a result, the distance between each pair of (noisy) instances can be better estimated, which facilitates side information utilization and metric learning. Experiments on prediction and visualization clearly indicate the effectiveness of Drift.

Citation (APA)

Ye, H. J., Zhan, D. C., Si, X. M., & Jiang, Y. (2017). Learning Mahalanobis distance metric: Considering instance disturbance helps. In IJCAI International Joint Conference on Artificial Intelligence (pp. 3315–3321). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2017/463
