Given that $x_i \in \mathbb{R}^d$, and $$m_1=\frac{1}{N}\sum\limits_{i}||x_i-\mu||_2^2$$ $$m_2=\frac{1}{N^2}\sum\limits_{i}\sum\limits_{j}||x_i-x_j||_2^2$$
where $\mu=\frac{1}{N}\sum\limits_{i}x_i$
I am trying to determine whether $m_1=m_2$ or $2m_1=m_2$.
Here's what I've tried: $$m_1 =\frac{1}{N}\sum\limits_{i}||x_i-\frac{1}{N}x_i||_2^2 =\frac{1}{N}\sum\limits_{i}||\frac{1}{N}(Nx_i-x_i)||_2^2 =\frac{1}{N^3}\sum\limits_{i}||(N-1)x_i||_2^2 $$ $$m_2=\frac{1}{N^2}\sum\limits_{i}\sum\limits_{j}||x_i-x_j||_2^2$$
I do not know if my steps for $m_1$ are correct, or how I should proceed with simplifying $m_2$ so that only sums over $i$ remain. Can someone please help me with this? Thank you.
$ \def\bb{\mathbb} \def\e{\varepsilon} \def\l{\left} \def\r{\right} \def\o{h} \def\p{\partial} \def\lr#1{\l(#1\r)} \def\trace#1{\operatorname{Tr}\lr{#1}} \def\grad#1#2{\frac{\p #1}{\p #2}} \def\c#1{\color{red}{#1}} \def\m#1{\left[\begin{array}{r}#1\end{array}\right]} $Let $\{\e_k\}$ denote the standard basis for ${\bb R}^n$. Then, given a matrix $$\eqalign{ X &= \m{x_1&x_2&\ldots&x_n} \in{\bb R}^{m\times n} \\ }$$ we can calculate the following quantities $$\eqalign{ \o &= \sum_{k=1}^n \e_k & \big({\rm all\,ones\,vector}\big) \\ I &= \sum_{k=1}^n \e_k\e_k^T & \big({\rm identity\,matrix}\big) \\ x_k &= X\e_k & \big(k^{th}\,{\rm column\,of}\,X\big) \\ \mu &= \frac 1n \sum_{k=1}^n x_k = \frac 1nX \lr{\sum_{k=1}^n \e_k} = \frac 1nX\o \quad & \big({\rm mean\,of\,cols\,of}\,X\big) \\ M &= \m{\mu&\mu&\ldots&\mu} \;=\; \mu\o^T & \big({\rm matrix\,of\,the\,means}\big) \\ \mu &= M\e_k & \big(k^{th}\,{\rm column\,of}\,M\big) \\ \mu &= \frac 1n \sum_{k=1}^n \mu = \frac 1nM \lr{\sum_{k=1}^n \e_k} = \frac 1nM\o & \big({\rm mean\,of\,cols\,of}\,M\big) \\ M:M &= \mu\o^T:\mu\o^T = \lr{\o:\o}\lr{\mu:\mu} = \c{n\lr{\mu:\mu}} \\ X:M &= X:\mu\o^T = X\o:\mu = \c{n\lr{\mu:\mu}} \\ }$$ where the colon denotes the matrix inner product, i.e. $$\eqalign{ A:B &= \sum_{i=1}^m\sum_{j=1}^n A_{ij}B_{ij} \;=\; \trace{A^TB} \\ A:A &= \big\|A\big\|^2_F \\ }$$ When applied to vectors, it corresponds to the ordinary vector inner product (a.k.a. the dot product).
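The two highlighted identities, $M:M = n(\mu:\mu)$ and $X:M = n(\mu:\mu)$, are easy to check numerically. Here is a quick sketch in NumPy (the helper `frob` and all variable names are my own, not part of the derivation):

```python
import numpy as np

def frob(A, B):
    """Colon (Frobenius) inner product  A:B = Tr(A^T B)."""
    return np.trace(A.T @ B)

rng = np.random.default_rng(1)
m, n = 4, 6
X = rng.standard_normal((m, n))   # columns are the points x_k
ones = np.ones(n)                 # the all-ones vector
mu = X @ ones / n                 # mean of the columns of X
M = np.outer(mu, ones)            # matrix of the means, M = mu 1^T

assert np.isclose(frob(M, M), n * (mu @ mu))   # M:M = n(mu:mu)
assert np.isclose(frob(X, M), n * (mu @ mu))   # X:M = n(mu:mu)
```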
Now apply these ideas to your first sum $$\eqalign{ \alpha &= \frac 1n \sum_{i=1}^n \|x_i-\mu\|^2 \\ n\alpha &= \sum_{i=1}^n (X-M)\e_i:(X-M)\e_i \\ &= (X-M)^T(X-M):\lr{\sum_{i=1}^n\e_i\e_i^T} \\ &= (X-M)^T(X-M):I \\ &= (X-M):(X-M) \\ &= \lr{X:X} - 2\lr{M:X} + \lr{M:M} \\ &= \lr{X:X} - n\lr{\mu:\mu} \\ \alpha &= \frac 1n\lr{X:X} - \lr{\mu:\mu} \\ }$$ Then to your second sum $$\eqalign{ \beta &= \frac 1{n^2} \sum_{i=1}^n\sum_{j=1}^n \|x_i-x_j\|^2 \\ n^2\beta &= \sum_{i=1}^n\sum_{j=1}^n X(\e_i-\e_j):X(\e_i-\e_j) \\ &= X^TX:\lr{\sum_{i=1}^n\sum_{j=1}^n(\e_i-\e_j)(\e_i-\e_j)^T} \\ &= X^TX:\lr{\sum_{i=1}^n\sum_{j=1}^n \e_i\e_i^T - \e_i\e_j^T - \e_j\e_i^T + \e_j\e_j^T} \\ &= X^TX:\lr{nI - \o\o^T - \o\o^T + nI} \\ &= X^TX:\lr{2nI - 2\o\o^T} \\ &= 2n\lr{X:X} - 2\lr{X\o:X\o} \\ &= 2n\lr{X:X} - 2n^2\lr{\mu:\mu} \\ \beta &= \frac 2n\lr{X:X} - 2\lr{\mu:\mu} \;\doteq\; 2\alpha \\\\ }$$
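So $\beta = 2\alpha$, i.e. $m_2 = 2m_1$ in your notation. A quick numerical sanity check (a NumPy sketch; the variable names are mine) confirms this on random data:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 7, 3                        # n points in R^d
X = rng.standard_normal((d, n))    # columns are the points x_i
mu = X.mean(axis=1)                # centroid

# m1: mean squared distance to the centroid
m1 = np.mean(np.sum((X - mu[:, None])**2, axis=0))

# m2: mean squared pairwise distance over all n^2 ordered pairs (i, j)
diffs = X[:, :, None] - X[:, None, :]
m2 = np.sum(diffs**2) / n**2

assert np.isclose(m2, 2 * m1)      # m2 = 2*m1
```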
NB: The properties of the underlying trace function allow the terms in a colon product to be rearranged in many equivalent ways, e.g. $$\eqalign{ A:B &= B:A \\ A:B &= A^T:B^T \\ A:BX &= B^TA:X = AX^T:B \\ }$$
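For instance, the rearrangement $A:BX = B^TA:X = AX^T:B$ can be verified on random matrices of compatible shapes (a sketch; shapes and names are my own choices):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((3, 5))
X = rng.standard_normal((5, 4))

# colon product A:B = Tr(A^T B)
colon = lambda P, Q: np.trace(P.T @ Q)

assert np.isclose(colon(A, B @ X), colon(B.T @ A, X))   # A:BX = B^T A : X
assert np.isclose(colon(A, B @ X), colon(A @ X.T, B))   # A:BX = A X^T : B
```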