I recently sat an exam (Stanford's CS229 on machine learning) and I faced the following question.
True or False? K-means is guaranteed to converge to a local minimum of the objective $J(c, \mu) = \sum_{i=1}^{n} \lVert x^{(i)} - \mu_{c(i)} \rVert^2$ in a finite number of iterations.
I answered that this is false because, as the lecture notes point out, the K-means algorithm is only guaranteed to converge in value space $\mathbb R$ (because our data set is finite and $K$, the number of clusters, is finite). If there are two distinct clusterings $c^*$ and $c'$ such that $J$ assigns the same value $r$ to each, and the value of $J$ under the action of the algorithm converges to $r$, then the algorithm may oscillate forever between the clusterings $c^*$ and $c'$.
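To make the value-space claim concrete, here is a minimal sketch of Lloyd's algorithm that records $J$ after every assignment step (the function name and interface are my own, for illustration). Each recorded value is no larger than the previous one, which is the sense in which the *value* of the objective converges:

```python
import numpy as np

def kmeans_objective_trace(X, k, iters=50, seed=0):
    """Run Lloyd's algorithm on X (an (n, d) array) with k clusters,
    returning the list of objective values J -- one per iteration.
    Illustrative sketch: the trace is monotonically non-increasing."""
    rng = np.random.default_rng(seed)
    mu = X[rng.choice(len(X), size=k, replace=False)].copy()  # init centroids
    trace, prev = [], None
    for _ in range(iters):
        # Assignment step: each point goes to its nearest centroid
        d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=2)
        c = d2.argmin(axis=1)
        # Objective J(c, mu) = sum_i ||x^(i) - mu_{c(i)}||^2
        trace.append(d2[np.arange(len(X)), c].sum())
        if prev is not None and c.tolist() == prev:
            break  # assignments unchanged: a fixed point in clustering space
        prev = c.tolist()
        # Update step: each centroid moves to the mean of its points
        for j in range(k):
            if (c == j).any():
                mu[j] = X[c == j].mean(axis=0)
    return trace
```

Both the assignment step and the update step can only decrease (or preserve) $J$, so the trace is non-increasing and, being bounded below by zero over finitely many clusterings, it converges.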
Of course, I am here because I was awarded zero marks, while everyone else seems to have answered True and been marked correct.
In my view, the statement would be true if it said that K-means is guaranteed to converge to a *set* of local minima of the objective, or that the *value* of the objective is guaranteed to converge. "Converging to a local minimum of the objective" is only a meaningful statement if we consider the graph of that objective.
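Ties of the kind my answer relies on do exist. A small check I put together: four points at the corners of a unit square admit two distinct clusterings (split left/right vs. split bottom/top) with identical objective value, so $J$ alone cannot distinguish them:

```python
import numpy as np

def J(X, c, mu):
    # K-means objective: sum of squared distances to assigned centroids
    return sum(((X[i] - mu[c[i]]) ** 2).sum() for i in range(len(X)))

# Four points at the corners of a unit square, k = 2
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])

# Clustering A splits left/right; clustering B splits bottom/top
c_A = np.array([0, 0, 1, 1])
c_B = np.array([0, 1, 0, 1])
mu_A = np.array([X[c_A == j].mean(axis=0) for j in range(2)])
mu_B = np.array([X[c_B == j].mean(axis=0) for j in range(2)])

print(J(X, c_A, mu_A), J(X, c_B, mu_B))  # both equal 1.0
```

Both clusterings are fixed points of the algorithm with the same objective value, which is exactly the situation where "converges to *a* local minimum" stops being well defined in clustering space.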
Looking for moral support here :)