k-means++: few more steps yield constant approximation

02/18/2020
by Davin Choo, et al.

The k-means++ algorithm of Arthur and Vassilvitskii (SODA 2007) is a state-of-the-art algorithm for solving the k-means clustering problem and is known to give an O(log k)-approximation in expectation. Recently, Lattanzi and Sohler (ICML 2019) proposed augmenting k-means++ with O(k log log k) local search steps to yield a constant approximation (in expectation) to the k-means clustering problem. In this paper, we improve their analysis to show that, for any arbitrarily small constant ε > 0, with only εk additional local search steps, one can achieve a constant approximation guarantee (with high probability in k), resolving an open problem in their paper.
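To make the setup concrete, here is a minimal sketch, not the authors' reference implementation, of k-means++ seeding followed by D²-sampling-based local search steps in the style of Lattanzi and Sohler. All function names, the dataset, and the choice of ε are illustrative assumptions.

```python
import numpy as np

def squared_dists(X, centers):
    """Squared distance from each point to its nearest center."""
    return np.min(((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)

def d2_sample(X, centers, rng):
    """Sample one point with probability proportional to its squared distance (D^2 sampling)."""
    d = squared_dists(X, centers)
    return X[rng.choice(len(X), p=d / d.sum())]

def kmeans_pp(X, k, rng):
    """k-means++ seeding: first center uniform at random, remaining centers by D^2 sampling."""
    centers = [X[rng.integers(len(X))]]
    for _ in range(k - 1):
        centers.append(d2_sample(X, np.array(centers), rng))
    return np.array(centers)

def local_search_pp(X, centers, steps, rng):
    """Local search: D^2-sample a candidate, swap it with the center whose
    replacement lowers the cost the most (if any swap improves the cost)."""
    centers = centers.copy()
    for _ in range(steps):
        cand = d2_sample(X, centers, rng)
        best_cost, best_i = squared_dists(X, centers).sum(), None
        for i in range(len(centers)):
            trial = centers.copy()
            trial[i] = cand
            cost = squared_dists(X, trial).sum()
            if cost < best_cost:
                best_cost, best_i = cost, i
        if best_i is not None:
            centers[best_i] = cand
    return centers

# Illustrative usage: seed with k-means++, then run eps*k extra local search steps.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))        # toy data, assumption for the example
k, eps = 10, 0.5                     # eps is an arbitrary small constant
C = kmeans_pp(X, k, rng)
C = local_search_pp(X, C, steps=int(eps * k), rng=rng)
print("final clustering cost:", squared_dists(X, C).sum())
```

The point of the paper is the analysis, not a new procedure: the claim is that roughly εk such swap steps already suffice for a constant-factor guarantee, rather than the O(k log log k) steps used by Lattanzi and Sohler.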

