I'm currently struggling with an apparently obvious conclusion.
Suppose there is a random vector $x$ with zero mean and covariance matrix $\Sigma$, whose eigenvalue decomposition is $\Sigma = U\Lambda U^T$ (so the columns of $U$ are the eigenvectors).
All the articles I have read on this topic use the same strategy to whiten this vector: define $y=\Lambda^{-1/2}U^T x$. Then they all say it is easy to see that $E(yy^T)=I$. And I just can't see it.
If I write it out, I get:
$$E(yy^T)=E(\Lambda^{-1/2}U^T x x^T U \Lambda^{-1/2})$$ What exactly is the next step? I know that the columns of $U$ are the eigenvectors of $\Sigma$, but I can't figure out the argument (or the next transformation) for why this should be the identity matrix.
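To convince myself that the claim at least holds numerically, I ran a quick check with numpy (note that `numpy.linalg.eigh` returns the eigenvectors as the columns of $U$, hence the transpose in the whitening matrix; the specific $\Sigma$ here is just a random positive definite matrix I made up for the test):

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary symmetric positive definite covariance matrix Sigma
A = rng.standard_normal((3, 3))
Sigma = A @ A.T + 3 * np.eye(3)

# Eigenvalue decomposition Sigma = U Lambda U^T
# (eigh returns eigenvalues lam and eigenvectors as the columns of U)
lam, U = np.linalg.eigh(Sigma)

# Whitening matrix W = Lambda^{-1/2} U^T
W = np.diag(lam ** -0.5) @ U.T

# Covariance of y = W x is E(yy^T) = W Sigma W^T
cov_y = W @ Sigma @ W.T
print(np.allclose(cov_y, np.eye(3)))  # True
```

So numerically the covariance of $y$ really is the identity; I just don't see the algebraic argument.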
Could you help me with that?
I think I already understand the purpose of whitening, but I can't see why this step is supposed to be trivial.
Best regards