High-dimensional similarity learning via dual-sparse random projection


Abstract

We investigate how to adopt dual random projection for high-dimensional similarity learning. In high-dimensional similarity learning, random projection is commonly used to map high-dimensional features into a low-dimensional space in order to reduce the computational cost. However, dimensionality reduction sometimes yields unstable performance, because the solution obtained in the reduced space is suboptimal with respect to the original space. In this paper, we propose a dual random projection framework for similarity learning that recovers the optimal solution in the original space from the optimal solution in the subspace. Previous dual random projection methods usually make strong assumptions about the data, requiring it to be low rank or to have a large margin; these assumptions limit the applicability of dual random projection to similarity learning. We therefore adopt a dual-sparse regularized random projection method that introduces a sparse regularizer into the reduced dual problem. Because the original dual solution is sparse, applying a sparse regularizer in the reduced space relaxes the low-rank assumption. Experimental results show that our method is more effective and efficient than state-of-the-art solutions.
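
The paper's algorithm targets bilinear similarity learning with a sparse regularizer on the reduced dual problem. As a rough illustration of the underlying dual random projection recipe only (not the paper's method), the NumPy sketch below uses plain ridge regression as a stand-in problem: project the features, solve the dual in the reduced space, and recover a model in the original high-dimensional space from the dual variables rather than back-projecting the subspace solution. All sizes, names, and the regularization constant are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy high-dimensional data; all sizes and the regularizer are made up for
# illustration and are not taken from the paper.
n, d, k = 200, 5000, 100                      # samples, original dim, projected dim
X = rng.standard_normal((n, d))
w_true = np.zeros(d)
w_true[:20] = rng.standard_normal(20)         # sparse ground-truth weights
y = X @ w_true + 0.01 * rng.standard_normal(n)
lam = 1.0                                     # ridge regularizer

# 1) Random projection: compress features from d to k dimensions.
R = rng.standard_normal((d, k)) / np.sqrt(k)
X_proj = X @ R                                # shape (n, k)

# 2) Solve the dual problem in the reduced space. For ridge regression the
#    dual solution is alpha = (G + lam*I)^{-1} y with Gram matrix G; the
#    projected Gram matrix X_proj @ X_proj.T approximates X @ X.T, so this
#    alpha approximates the dual solution of the original problem.
G_proj = X_proj @ X_proj.T
alpha = np.linalg.solve(G_proj + lam * np.eye(n), y)

# 3) Key step of dual random projection: recover a model in the ORIGINAL
#    d-dimensional space from the dual variables, instead of mapping the
#    subspace primal solution back through R.
w_recovered = X.T @ alpha

# Naive baseline for comparison: solve the primal in the subspace, then
# back-project with R.
w_sub = np.linalg.solve(X_proj.T @ X_proj + lam * np.eye(k), X_proj.T @ y)
w_naive = R @ w_sub

print("dual recovery error  :", np.linalg.norm(w_recovered - w_true))
print("back-projection error:", np.linalg.norm(w_naive - w_true))
```

The dual-sparse variant described in the abstract would additionally impose a sparsity-inducing regularizer on the dual variables in step 2; that refinement is omitted from this sketch.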

Citation (APA)
Yao, D., Zhao, P., Pham, T. A. N., & Cong, G. (2018). High-dimensional similarity learning via dual-sparse random projection. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2018-July, pp. 3005–3011). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2018/417
