The full statement was too long to fit in the title. I found the following claim without much of a citation or proof:
"The sum of the squared distances from every point to the centroid is equal to sum of the squared distances from each point to each other point, divided by the number of points."
It indeed works with arbitrary toy examples (a triangle, a square), but I would like to know whether this holds in high-dimensional vector spaces, such as a 200-dimensional space filled with word vectors. Furthermore, does it hold for distance or similarity measures other than Euclidean, such as cosine, dot product, Canberra, or Bray–Curtis?
The centroid of a collection of $n$ vectors is just their average: $$\tilde v = \frac 1n \sum_{i = 1}^n v_i$$
To make matters a little easier, we can replace the $v_i$ with $w_i = v_i - \tilde v$; then the $w_i$ form a collection whose centroid is $0$, i.e., $\sum_{i = 1}^n w_i = 0$.
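To see this centering step concretely, here is a small numerical sketch (the random points are hypothetical stand-ins, not anything from the question):

```python
import numpy as np

# Hypothetical toy data: 5 points in 3 dimensions.
rng = np.random.default_rng(0)
v = rng.standard_normal((5, 3))

centroid = v.mean(axis=0)   # \tilde v = (1/n) * sum of the v_i
w = v - centroid            # centered copies w_i = v_i - \tilde v

# The centered points sum to (numerically) zero: their centroid is 0.
print(np.allclose(w.sum(axis=0), 0))  # True
```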
The "squared distance" between two points $u, v$ is the square of the norm of their difference $\|u - v\|^2$ which in an inner-product space is given by the inner product of the vector with itself: $$\begin{align}\|u - v\|^2 &= \langle u-v, u-v\rangle\\&=\langle u, u\rangle - \langle u, v\rangle-\langle v, u\rangle + \langle v, v\rangle\end{align}$$ If the vector space has the real numbers as its scalar field, then $\langle u, v\rangle=\langle v, u\rangle$, and we have $$\|u - v\|^2 = \|u\|^2 + \|v\|^2 - 2\langle u, v\rangle$$
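The expansion above can be spot-checked numerically, for instance with two random 200-dimensional vectors (a sketch, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.standard_normal(200)
v = rng.standard_normal(200)

lhs = np.sum((u - v) ** 2)                     # ||u - v||^2
rhs = np.sum(u**2) + np.sum(v**2) - 2 * (u @ v)  # ||u||^2 + ||v||^2 - 2<u, v>
print(np.isclose(lhs, rhs))  # True
```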
The sum of the squared distances of all the $w_i$ from each other (which is also the sum of the squared distances of all the $v_i$ from each other) is given by $\frac 12\sum_{i=1}^n\sum_{j=1}^n \|w_i -w_j\|^2$, where the $\frac 12$ is because each pair of points occurs twice in this double sum. But $$\begin{align}\frac 12\sum_{i=1}^n\sum_{j=1}^n \|w_i -w_j\|^2 &= \frac 12\sum_{i=1}^n\sum_{j=1}^n\bigg( \|w_i\|^2 +\|w_j\|^2 - 2\langle w_i, w_j\rangle\bigg)\\ &=\frac n2\sum_{i=1}^n \|w_i\|^2 + \frac n2\sum_{j=1}^n\|w_j\|^2 - \sum_{i=1}^n\sum_{j=1}^n \langle w_i, w_j\rangle\\ &=n\sum_{i=1}^n \|w_i\|^2 - \sum_{i=1}^n \left\langle w_i, \sum_{j=1}^nw_j\right\rangle\\ &=n\sum_{i=1}^n \|w_i\|^2 - \sum_{i=1}^n\left\langle w_i, 0\right\rangle\\ &= n\sum_{i=1}^n \|w_i\|^2\end{align}$$
So the sum of the squared distances of the $w_i$ from their centroid $0$ is the sum of the squared distances between all the pairs of points divided by their number $n$.
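Since the question asks specifically about 200-dimensional word vectors, here is a sketch of a numerical check of the identity, using random vectors as stand-ins for word embeddings:

```python
import numpy as np

# Hypothetical stand-ins for word vectors: n points in 200 dimensions.
rng = np.random.default_rng(42)
n, d = 50, 200
v = rng.standard_normal((n, d))

# Sum of squared distances from every point to the centroid.
centroid = v.mean(axis=0)
sum_sq_to_centroid = np.sum((v - centroid) ** 2)

# Sum of squared distances over all pairs, via the double sum divided by 2.
diffs = v[:, None, :] - v[None, :, :]
sum_sq_pairs = np.sum(diffs ** 2) / 2

# The identity: centroid sum equals the pairwise sum divided by n.
print(np.isclose(sum_sq_to_centroid, sum_sq_pairs / n))  # True
```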
Note that this is very much a property of vector space inner products. It is not going to be true for vector space norms that do not come from inner products. And it is most definitely not going to be true of "metrics" in general. Note also that I made no reference to the dimension of the vector space: the identity holds in any number of dimensions, including infinite-dimensional vector spaces, provided that distance is calculated by an inner-product norm. Though I chose to assume a real vector space, the same proof also works for complex vector spaces.
I am not going to bother researching these other "metrics" to figure out what they are. Unless they are obtained from an inner product on a vector space by $d(u,v) = \sqrt{\langle u-v, u-v\rangle}$, there is no reason to believe this will work for them. Don't let a similarity in naming conventions lead you astray.
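As an illustration of why non-inner-product distances fail, here is a quick counterexample sketch with cosine distance ($1 -$ cosine similarity), still taking the "centroid" to be the arithmetic mean; the three points are an arbitrary hypothetical example:

```python
import numpy as np

def cosine_dist(a, b):
    # Cosine distance: 1 - cosine similarity. Not induced by an
    # inner-product norm, so the identity has no reason to hold.
    return 1.0 - (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

pts = np.array([[1.0, 0.0], [0.0, 1.0], [2.0, 2.0]])
n = len(pts)
centroid = pts.mean(axis=0)

# Sum of squared cosine distances to the centroid...
to_centroid = sum(cosine_dist(p, centroid) ** 2 for p in pts)

# ...versus the sum over all pairs, divided by n.
pairwise = sum(cosine_dist(pts[i], pts[j]) ** 2
               for i in range(n) for j in range(i + 1, n))

print(np.isclose(to_centroid, pairwise / n))  # False
```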