The nearest neighbor search (NNS) problem is the following: given a set of n points P = {p_1, …, p_n} in some metric space X, preprocess P so as to efficiently answer queries that require finding a point in P closest to a query point q ∈ X. Approximate nearest neighbor search (c-NNS) is a relaxation of NNS in which it suffices to return any point within c times the distance to the nearest neighbor (called a c-nearest neighbor). This problem is of major and growing importance to a variety of applications. In this paper, we give a (4⌈log_{1+ρ} log 4d⌉ + 1)-NNS algorithm in l_∞^d with O(d n^{1+ρ} log^{O(1)} n) storage and O(d log^{O(1)} n) query time. Moreover, we obtain a 3-NNS algorithm for l_∞ with n^{log d + 1} storage. The preprocessing time is close to linear in the size of the data structure. After simple modifications, the algorithm can also be used to output the exact nearest neighbor in time bounded by O(d log^{O(1)} n) plus the number of (4⌈log_{1+ρ} log 4d⌉ + 1)-nearest neighbors of the query point. Building on this result, we also obtain an approximation algorithm for a general class of product metrics. Finally, we show that for any c < 3, the c-NNS problem in l_∞ is provably as hard as the subset query problem (also called the partial match problem). This indicates that obtaining sublinear query time and subexponential (in d) space for c < 3 might be hard.
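The abstract does not describe the data structure itself, so the following is only a minimal Python sketch of the problem definition: it computes the exact nearest neighbor under the l_∞ (max-coordinate) norm by brute force and checks the c-approximation guarantee that any c-NNS answer must satisfy. All function names here are illustrative and do not come from the paper.

```python
# Illustrative sketch (not the paper's data structure): brute-force exact
# nearest neighbor under the l_infinity norm, plus a check of the
# c-approximation guarantee that a c-NNS answer must satisfy.
from typing import List, Sequence, Tuple


def linf_dist(p: Sequence[float], q: Sequence[float]) -> float:
    """l_infinity (max-coordinate) distance between two d-dimensional points."""
    return max(abs(pi - qi) for pi, qi in zip(p, q))


def exact_nn(points: List[Sequence[float]], q: Sequence[float]) -> Tuple[int, float]:
    """Return (index, distance) of the exact nearest neighbor of q, in O(n*d) time."""
    best_i, best_d = 0, linf_dist(points[0], q)
    for i, p in enumerate(points[1:], start=1):
        d = linf_dist(p, q)
        if d < best_d:
            best_i, best_d = i, d
    return best_i, best_d


def is_c_approximate(points: List[Sequence[float]], q: Sequence[float],
                     answer_index: int, c: float) -> bool:
    """Check the c-NNS guarantee: the returned point lies within c times
    the distance from q to its true nearest neighbor."""
    _, opt = exact_nn(points, q)
    return linf_dist(points[answer_index], q) <= c * opt


if __name__ == "__main__":
    P = [(0.0, 0.0), (1.0, 5.0), (2.0, 2.0)]
    q = (1.5, 1.0)
    idx, dist = exact_nn(P, q)
    print(idx, dist)                       # exact nearest neighbor of q under l_inf
    print(is_c_approximate(P, q, idx, 3))  # trivially True for the exact answer
```

The brute-force search above takes O(nd) time per query; the point of the paper's result is to answer approximate queries in O(d log^{O(1)} n) time at the cost of extra storage.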
CITATION STYLE
Indyk, P. (2001). On approximate nearest neighbors under l∞ norm. Journal of Computer and System Sciences, 63(4), 627–638. https://doi.org/10.1006/jcss.2001.1781