I am studying the k-means clustering algorithm.
Suppose I have $k=2$ clusters, and the partition vector $I$ is a binary vector indicating, for each observation, whether it belongs to the second cluster. Further, suppose $I^{\star}$ is the optimal partition vector, i.e. the one at which the performance function $S$ attains its global minimum $S(I^{\star})$. I would like to understand the behavior of partition vectors around this global optimum. How could I prove that, near the global optimum, $\|I^{\star}-I_1\|_2 \leq \|I^{\star}-I_2\|_2$ whenever $S(I_1) \leq S(I_2)$?
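To build intuition, here is a small brute-force experiment I can run on toy data. The specifics are my own assumptions, not part of the claim: I take $S$ to be the usual within-cluster sum of squares, use a small 1-D dataset with two well-separated groups, enumerate every binary partition vector, and count pairs $(I_1, I_2)$ with $S(I_1) \leq S(I_2)$ but $\|I^{\star}-I_1\|_2 > \|I^{\star}-I_2\|_2$. Note also that $I$ and $1-I$ describe the same partition but different vectors, which may need separate handling in a proof.

```python
import itertools
import numpy as np

def S(I, X):
    """Within-cluster sum of squares for binary partition vector I (an assumed form of S)."""
    total = 0.0
    for label in (0, 1):
        pts = X[I == label]
        if len(pts) > 0:
            total += ((pts - pts.mean(axis=0)) ** 2).sum()
    return total

rng = np.random.default_rng(0)
# toy data: two well-separated 1-D clusters of 4 points each
X = np.concatenate([rng.normal(0.0, 0.5, 4), rng.normal(10.0, 0.5, 4)])[:, None]
n = len(X)

# enumerate all 2^n binary partition vectors
partitions = [np.array(I) for I in itertools.product((0, 1), repeat=n)]
costs = np.array([S(I, X) for I in partitions])

# take I* as a global minimizer of S (ties broken arbitrarily by argmin)
I_star = partitions[int(costs.argmin())]
dists = np.array([np.linalg.norm(I_star - I) for I in partitions])

# count ordered pairs violating the claimed implication
# S(I1) <= S(I2)  =>  ||I* - I1||_2 <= ||I* - I2||_2
violations = sum(
    1
    for a, b in itertools.permutations(range(len(partitions)), 2)
    if costs[a] <= costs[b] and dists[a] > dists[b]
)
print("violating pairs:", violations)
```

In this literal-vector reading the relabeled optimum $1 - I^{\star}$ already produces violating pairs (it has the same cost as $I^{\star}$ but distance $\sqrt{n}$ from it), so any proof presumably needs to restrict to a neighborhood of $I^{\star}$ or identify $I$ with $1-I$.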