Consistent k-Clustering for General Metrics

Ashkan Norouzi Fard
Hendrik Fichtenberger
Ola Svensson
SODA 2021 (to appear)

Abstract

Given a stream of points in a metric space, is it possible to maintain a constant-factor approximate clustering by changing the cluster centers only a small number of times during the entire execution of the algorithm? This question has received attention in recent years in the machine learning literature and, before our work, the best known algorithm performed $\widetilde{O}(k^2)$ center swaps (the $\widetilde{O}(\cdot)$ notation hides polylogarithmic factors in the number of points $n$ and the aspect ratio $\Delta$ of the input instance). This is a quadratic increase compared to the offline case, in which the whole stream is known in advance and one is interested in keeping a constant approximation at any point in time: there, $\widetilde{O}(k)$ swaps are known to be sufficient, and simple examples show that $\Omega(k \log(n \Delta))$ swaps are necessary. We close this gap by developing an algorithm that, perhaps surprisingly, matches the guarantees of the offline setting. Specifically, we show how to maintain a constant-factor approximation for the $k$-median problem by performing an optimal (up to polylogarithmic factors) number $\widetilde{O}(k)$ of center swaps. To obtain our result, we leverage new structural properties of $k$-median clustering that may be of independent interest.
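To make the setting concrete, the following is a minimal, purely illustrative Python sketch of the consistency measure: points arrive one by one, a set of $k$ centers is maintained, and every replacement of a maintained center is counted as a swap. This is not the paper's algorithm; it is a naive baseline that recomputes a toy local-search $k$-median solution after each arrival, works on the real line rather than a general metric for simplicity, and uses hypothetical helper names (`offline_kmedian`, `consistent_stream`) introduced only for this sketch.

```python
# Illustrative sketch of the "consistent k-clustering" setting (not the
# paper's algorithm): a naive baseline that recomputes centers from scratch
# after every arrival and counts the resulting center swaps.

def kmedian_cost(points, centers):
    """k-median objective: sum over points of the distance to the nearest center."""
    return sum(min(abs(p - c) for c in centers) for p in points)

def offline_kmedian(points, k, iters=50):
    """Toy single-swap local search for k-median on the real line."""
    centers = list(points[:k])
    for _ in range(iters):
        improved = False
        for i in range(len(centers)):
            for cand in points:
                trial = centers[:i] + [cand] + centers[i + 1:]
                if kmedian_cost(points, trial) < kmedian_cost(points, centers):
                    centers, improved = trial, True
        if not improved:
            break
    return centers

def consistent_stream(stream, k):
    """Process the stream, maintaining k centers and counting center swaps."""
    seen, centers, swaps = [], [], 0
    for p in stream:
        seen.append(p)
        new_centers = offline_kmedian(seen, k) if len(seen) >= k else list(seen)
        # A "swap" here is any newly maintained center (opening a center counts too).
        swaps += len(set(new_centers) - set(centers))
        centers = new_centers
    return centers, swaps

if __name__ == "__main__":
    # Points at exponentially growing scales keep shifting the 1-median,
    # loosely illustrating why many swaps are unavoidable as the aspect
    # ratio Delta grows.
    stream = [2.0 ** i for i in range(12)]
    centers, swaps = consistent_stream(stream, k=1)
    print("final centers:", centers, "total swaps:", swaps)
```

The point of the sketch is only the bookkeeping: the quantity bounded in the paper is the total number of swaps over the whole stream, and the recompute-from-scratch baseline above makes no attempt to keep that number small.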