Wednesday, March 30, 2016

Local Search Yields a PTAS for k-Means in Doubling Metrics. (arXiv:1603.08976v1 [cs.DS])

The most well known and ubiquitous clustering problem encountered in nearly every branch of science is undoubtedly $k$-means: given a set of data points and a parameter $k$, select $k$ centres and partition the data points into $k$ clusters around these centres so that the sum of squares of distances of the points to their cluster centre (called the cost of the solution) is minimized. Typically these data points lie in Euclidean space $\mathbb{R}^d$ for some $d\geq 2$.
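As an illustration (not part of the abstract), the objective just described can be written in a few lines of NumPy; the function name `kmeans_cost` and the optional exponent `q` (which anticipates the generalised sum-of-$q$-th-powers objective mentioned at the end) are choices of this sketch:

```python
import numpy as np

def kmeans_cost(points, centres, q=2):
    """Cost of a solution: sum over points of the q-th power of the
    distance to the nearest centre. q=2 is the standard k-means
    objective; points is an (n, d) array, centres a (k, d) array."""
    # Pairwise point-to-centre distances, shape (n, k).
    dists = np.linalg.norm(points[:, None, :] - centres[None, :, :], axis=2)
    # Each point is charged its nearest centre; total the charges.
    return float(np.sum(dists.min(axis=1) ** q))
```

For example, two points at $(0,0)$ and $(0,2)$ served by a single centre at their mean $(0,1)$ incur cost $1+1=2$, while a centre placed at $(0,0)$ incurs cost $0+4=4$.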

The most commonly used algorithm in practice is known as Lloyd-Forgy, which is also referred to as "the" $k$-means algorithm; it and its various extensions often work very well in practice. However, they may produce solutions whose cost is arbitrarily large compared to that of the optimum solution. Kanungo et al. [2004] analyzed a very simple local search heuristic to get a polynomial-time algorithm with approximation ratio $9+\epsilon$ for any fixed $\epsilon>0$ for $k$-means in Euclidean space.
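For context, a minimal sketch of the Lloyd-Forgy heuristic follows; the function name `lloyd`, the Forgy-style initialisation from random input points, and the iteration cap are assumptions of this sketch, not details from the paper. Each step can only decrease the cost, but the local optimum reached can be arbitrarily worse than the global one, as the text notes:

```python
import numpy as np

def lloyd(points, k, iters=100, seed=0):
    """Plain Lloyd-Forgy iteration: alternately (1) assign each point
    to its nearest centre and (2) move each centre to the mean of its
    assigned points, until the centres stop moving."""
    rng = np.random.default_rng(seed)
    # Forgy initialisation: pick k distinct input points as centres.
    centres = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None, :] - centres[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each centre as the mean of its cluster (keep a
        # centre in place if its cluster happens to be empty).
        new_centres = np.array([
            points[labels == j].mean(axis=0) if np.any(labels == j) else centres[j]
            for j in range(k)
        ])
        if np.allclose(new_centres, centres):
            break
        centres = new_centres
    dists = np.linalg.norm(points[:, None, :] - centres[None, :, :], axis=2)
    return centres, dists.argmin(axis=1)
```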

Finding an algorithm with a better worst-case approximation guarantee has remained one of the biggest open questions in this area, in particular whether one can get a true PTAS for fixed dimension Euclidean space. We settle this problem by showing that a simple local search algorithm provides a PTAS for $k$-means for $\mathbb{R}^d$ for any fixed $d$.

More precisely, for any error parameter $\epsilon>0$, the local search algorithm that considers swaps of up to $\rho=d^{O(d)}\cdot{\epsilon}^{-O(d/\epsilon)}$ centres will produce a solution whose cost is at most $1+\epsilon$ times the optimum cost. Our analysis extends very easily to the more general setting where the metric has fixed doubling dimension, and to minimizing the sum of the $q$-th powers of the distances for fixed $q$.
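To make the swap-based local search concrete, here is an illustrative special case with single swaps ($\rho=1$) over the discrete candidate set of input points; the PTAS in the paper instead uses multi-swaps of up to $\rho$ centres, and the $(1-\epsilon/k)$ improvement threshold used to guarantee polynomially many iterations is a standard device assumed here, not quoted from the paper:

```python
import numpy as np

def cost(points, centres):
    """Standard k-means cost: sum of squared nearest-centre distances."""
    d = np.linalg.norm(points[:, None, :] - centres[None, :, :], axis=2)
    return float(np.sum(d.min(axis=1) ** 2))

def local_search_kmeans(points, k, eps=0.01):
    """Single-swap local search: starting from an arbitrary set of k
    centres (here, input points), repeatedly replace one centre by one
    non-centre candidate whenever the swap improves the cost by more
    than a (1 - eps/k) factor; stop at a local optimum."""
    centres = list(range(k))  # indices of current centres into points
    improved = True
    while improved:
        improved = False
        cur = cost(points, points[centres])
        for i in range(k):
            for cand in range(len(points)):
                if cand in centres:
                    continue
                trial = centres.copy()
                trial[i] = cand  # try swapping centre i for candidate
                if cost(points, points[trial]) < (1 - eps / k) * cur:
                    centres = trial
                    improved = True
                    break
            if improved:
                break
    return points[centres]
```

On two well-separated pairs of points, this reaches the best solution with centres restricted to input points (one centre per pair, cost $1+1=2$).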

from cs.AI updates on arXiv.org http://ift.tt/25xprKU
via IFTTT
