Abstract
Retrieving relevant vectors produced by representation learning critically affects the efficiency of natural language processing (NLP) tasks. In this paper, we demonstrate an efficient method for searching vectors under a typical non-metric matching function: the inner product. Our method constructs an approximate Inner Product Delaunay Graph (IPDG) for top-1 Maximum Inner Product Search (MIPS), transforming the retrieval of the most suitable latent vectors into a graph search problem with substantial efficiency gains. Experiments on data representations learned for different machine learning tasks verify the superior effectiveness and efficiency of the proposed IPDG.
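To make the task concrete, the following is a minimal sketch of the top-1 MIPS problem that IPDG accelerates: given a query vector and a collection of latent vectors, find the vector with the largest inner product. This exhaustive baseline (written here with a hypothetical `top1_mips` helper, not the paper's graph-based method) is what graph search approaches aim to outperform, since it scans every vector.

```python
import numpy as np

def top1_mips(query, vectors):
    # Exhaustive top-1 Maximum Inner Product Search:
    # score every candidate by its inner product with the query
    # and return the index of the best one. O(n * d) per query.
    scores = vectors @ query
    return int(np.argmax(scores))

# Toy collection of 2-d latent vectors and a query.
vectors = np.array([[1.0, 0.0],
                    [0.0, 2.0],
                    [1.0, 1.0]])
query = np.array([0.5, 1.0])

best = top1_mips(query, vectors)  # inner products: 0.5, 2.0, 1.5
```

Note that the inner product is non-metric (it violates the triangle inequality, and a vector need not be its own nearest neighbor), which is why standard metric-space indexes do not directly apply and a specialized structure such as the IPDG is needed.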
Tan, S., Zhou, Z., Xu, Z., & Li, P. (2019). On efficient retrieval of top similarity vectors. In EMNLP-IJCNLP 2019 - 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing, Proceedings of the Conference (pp. 5236–5246). Association for Computational Linguistics. https://doi.org/10.18653/v1/D19-1527