Given a stream of points in a metric space, is it possible to maintain a constant approximate clustering by changing the cluster centers only a small number of times during the entire execution of the algorithm? This question has received attention in recent years in the machine learning literature and, before our work, the best known algorithm performs Õ(k²) center swaps (the Õ(·) notation hides polylogarithmic factors in the number of points n and the aspect ratio ∆ of the input instance). This is a quadratic increase compared to the offline case (where the whole stream is known in advance and one is interested in keeping a constant approximation at any point in time), for which Õ(k) swaps are known to be sufficient and simple examples show that Ω(k log(n∆)) swaps are necessary. We close this gap by developing an algorithm that, perhaps surprisingly, matches the guarantees of the offline setting. Specifically, we show how to maintain a constant-factor approximation for the k-median problem by performing an optimal (up to polylogarithmic factors) number Õ(k) of center swaps. To obtain our result we leverage new structural properties of k-median clustering that may be of independent interest.
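As a rough illustration of the setup only (not the algorithm from the paper), the sketch below shows the two quantities the abstract refers to: the k-median cost of a set of centers, and the number of center swaps a streaming procedure performs over the whole stream. The periodic random re-seeding is a hypothetical placeholder heuristic, and all function names are assumptions made for this example.

```python
# Toy illustration of the k-median objective and the consistency measure
# (number of center swaps over the stream). This is NOT the paper's algorithm,
# which achieves a constant-factor approximation with only O~(k) swaps.
import math
import random


def dist(p, q):
    """Euclidean distance; stands in for a general metric d(p, q)."""
    return math.dist(p, q)


def k_median_cost(points, centers):
    """k-median objective: sum of each point's distance to its nearest center."""
    return sum(min(dist(p, c) for c in centers) for p in points)


def count_swaps(old_centers, new_centers):
    """Number of centers that changed between two consecutive solutions."""
    return len(set(new_centers) - set(old_centers))


def naive_streaming_k_median(stream, k, recluster_every=50, seed=0):
    """Placeholder streaming heuristic: recluster from scratch periodically
    and track the total number of center swaps performed."""
    rng = random.Random(seed)
    points, centers, total_swaps = [], [], 0
    for t, p in enumerate(stream, start=1):
        points.append(p)
        if len(centers) < k:
            centers = centers + [p]               # bootstrap with the first k points
            total_swaps += 1
        elif t % recluster_every == 0:
            new_centers = rng.sample(points, k)   # stand-in for a real k-median solver
            total_swaps += count_swaps(centers, new_centers)
            centers = new_centers
    return centers, total_swaps, k_median_cost(points, centers)


if __name__ == "__main__":
    rng = random.Random(1)
    stream = [(rng.random(), rng.random()) for _ in range(500)]
    centers, swaps, cost = naive_streaming_k_median(stream, k=5)
    print(f"final cost = {cost:.2f}, total center swaps = {swaps}")
```

Under this toy heuristic the swap count grows with the number of reclustering steps; the point of the paper is that a careful algorithm keeps the total number of swaps near the Ω(k log(n∆)) lower bound while preserving a constant-factor approximation at every point in time.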
Fichtenberger, H., Lattanzi, S., Norouzi-Fard, A., & Svensson, O. (2021). Consistent k-clustering for general metrics. In Proceedings of the Annual ACM-SIAM Symposium on Discrete Algorithms (pp. 2660–2678). Association for Computing Machinery. https://doi.org/10.1137/1.9781611976465.158