Say I have the covariance matrix $\mathbf{S}=\sum_{i=1}^{N}\left(\mathbf{x}_{i}-\boldsymbol{\mu}\right)\left(\mathbf{x}_{i}-\boldsymbol{\mu}\right)^{T}$
and a set of basis vectors $\mathbf{u}_{j}$ in a matrix $\mathbf{U}=\left[\mathbf{u}_{1}, \ldots, \mathbf{u}_{d}\right]$
How do I prove the following:
If $\mathbf{U}^{T} \mathbf{S}\mathbf{U}=\mathbf{0}$, prove that $\mathbf{U}$ projects all observations $\mathbf{x}_{i}$ to the same point, i.e. $\mathbf{U}^{T} \mathbf{x}_{i}=\mathbf{U}^{T} \mathbf{x}_{j}$ for all $i, j$.
I'm a bit stuck with this, I know it is probably obvious.
My attempt is to write $\mathbf{S} = \overline{\mathbf{X}}\,\overline{\mathbf{X}}^{T}$, where $\overline{\mathbf{X}}$ has the centered observations $\mathbf{x}_{i}-\boldsymbol{\mu}$ as columns, and to let $\mathbf{Y} = \mathbf{U}^{T}\overline{\mathbf{X}}$ be the projection. Then $\mathbf{U}^{T} \mathbf{S}\mathbf{U}=\mathbf{Y}\mathbf{Y}^{T}= \mathbf{0}$, so the projected covariance matrix is completely degenerate and the projection carries no information. But I don't see how to get from there to the statement above.
Does $\mathbf{U}^{T} \mathbf{x}_{i}$ equal the zero vector for every $i$?
If you let $y_i=U^T x_i$, then $U^T S U = \sum_{i=1}^N (y_i-\nu)(y_i-\nu)^T$ where $\nu$ is the mean of the $y_i$s (which is exactly $U^T \mu$). This means that the covariance of the $y_i$ is $0$.
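This identity is easy to check numerically. A minimal sketch with NumPy (random data, arbitrary $U$; the array names are my own, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
N, D, d = 50, 5, 2

X = rng.standard_normal((N, D))   # rows are the observations x_i
mu = X.mean(axis=0)
Xc = X - mu                       # centered observations
S = Xc.T @ Xc                     # S = sum_i (x_i - mu)(x_i - mu)^T, no 1/N factor

U = rng.standard_normal((D, d))   # arbitrary basis vectors as columns
Y = X @ U                         # rows are y_i = U^T x_i
nu = Y.mean(axis=0)               # mean of the y_i
Yc = Y - nu

lhs = U.T @ S @ U
rhs = Yc.T @ Yc                   # sum_i (y_i - nu)(y_i - nu)^T

print(np.allclose(lhs, rhs))      # True
print(np.allclose(nu, U.T @ mu))  # True: nu is exactly U^T mu
```

Both checks confirm that $U^T S U$ is, term by term, the (unnormalized) covariance of the projected points.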
So your statement follows from this fact about any covariance matrix: $S_{j,j}=0$ implies $\sum_{i=1}^N (x_{i,j}-\mu_j)^2 = 0$, hence $x_{i,j}=\mu_j$ for all $i$. Applying this to the covariance of the $y_i$, which is entirely $0$ (in particular on its diagonal), every coordinate of every $y_i$ equals the corresponding coordinate of $\nu$, so $y_i=\nu=U^T\mu$ for all $i$, i.e. $U^T x_i = U^T x_j$.
There is probably a cleaner proof of that last fact.
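To see the conclusion concretely, here is a small example (my own construction, not from the question) where the data vary only in the first two coordinates, and $U$ picks out the two constant coordinates:

```python
import numpy as np

rng = np.random.default_rng(1)
N, D = 40, 4

# Data that vary only in coordinates 1 and 2; coordinates 3 and 4 are fixed,
# so directions supported on those coordinates carry no variance.
X = np.column_stack([rng.standard_normal((N, 2)),
                     np.full(N, 3.0),
                     np.full(N, -1.0)])
mu = X.mean(axis=0)
Xc = X - mu
S = Xc.T @ Xc

# U = [e_3, e_4]: a basis for the directions along which the data are constant
U = np.zeros((D, 2))
U[2, 0] = 1.0
U[3, 1] = 1.0

print(np.allclose(U.T @ S @ U, 0))  # True: projected covariance vanishes
Y = X @ U                           # projections U^T x_i as rows
print(np.allclose(Y, Y[0]))         # True: every x_i maps to the same point
```

Note that the common image is $U^T\mu$ (here $(3,-1)^T$), not the zero vector, which answers the follow-up question in the comments.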