I am looking at Bishop's pattern recognition book. I am a bit rusty on my linear algebra now, hope you can help me with the following.
The author says that the covariance matrix can be expressed as:
$$\Sigma = \sum_{i=1}^D\lambda_iu_iu_i^T$$
But I cannot quite get to that result. I am getting something else, which I doubt is correct, as I am expressing $\Sigma$ in terms of just one eigenvector. Note that $\Sigma$ is a $D \times D$ matrix.
Actually, I just realized that what I did makes no sense. So I am not even sure where to start now.
I was using initially the fact that
$$\Sigma u_i = \lambda_i u_i$$
but taking transposes of this equation does not seem to lead anywhere sensible.
EDIT: I guess if I use matrix notation:
$$\Sigma U = U\Lambda$$
it is a bit clearer: it seems I would have to show that $\sum_{i=1}^D u_iu_i^T = UU^T = I$, i.e. that the eigenvectors are orthonormal. I am afraid I may be talking nonsense... No... that is not possible...
Consider the matrix $Z = \Sigma - \sum_{i=1}^D\lambda_iu_iu_i^T$. For any $1 \le j \le D$, $$Zu_j = \Sigma u_j - \sum_{i=1}^D\lambda_iu_iu_i^Tu_j = \lambda_ju_j - \lambda_ju_j = 0,$$ where the cross terms vanish because the eigenvectors are orthonormal, $u_i^Tu_j = \delta_{ij}$. Since the $u_i$ form a basis of $\mathbb{R}^D$, $Z$ annihilates every vector, thus $Z = 0$, and $$\Sigma = \sum_{i=1}^D\lambda_iu_iu_i^T.$$
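A quick numerical sanity check of this argument, using a random symmetric positive semi-definite matrix as a stand-in covariance (NumPy's `eigh` returns orthonormal eigenvectors for symmetric matrices):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
Sigma = A @ A.T  # symmetric PSD matrix, D = 4

# eigh guarantees real eigenvalues and orthonormal eigenvectors (columns of U)
lam, U = np.linalg.eigh(Sigma)

# Z = Sigma - sum_i lambda_i u_i u_i^T
Z = Sigma - sum(lam[i] * np.outer(U[:, i], U[:, i]) for i in range(4))

# Z annihilates every eigenvector u_j ...
for j in range(4):
    assert np.allclose(Z @ U[:, j], 0)
# ... and since the u_j form a basis, Z is the zero matrix
assert np.allclose(Z, 0)
```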
Another way to look at this is the eigenvalue decomposition of a real symmetric matrix (the spectral theorem): $$\Sigma = U\Lambda U^T,$$ where the matrix $U = [u_1\ u_2\ \ldots\ u_D]$ has the eigenvectors as its columns, and $\Lambda = \text{diag}(\lambda_1, \lambda_2, \ldots, \lambda_D)$ is the diagonal matrix of eigenvalues. Expanding the product as a sum of rank-one terms gives $$\Sigma = U\Lambda U^T = \sum_{i=1}^D\lambda_iu_iu_i^T.$$
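The matrix identity can also be checked numerically on an arbitrary real symmetric matrix (not just a covariance), again relying on NumPy's `eigh`:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))
Sigma = (A + A.T) / 2  # any real symmetric matrix works here, D = 5

lam, U = np.linalg.eigh(Sigma)
Lam = np.diag(lam)

# Spectral theorem: Sigma = U Lambda U^T ...
assert np.allclose(Sigma, U @ Lam @ U.T)

# ... which expands into the sum of rank-one terms lambda_i u_i u_i^T
S = sum(lam[i] * np.outer(U[:, i], U[:, i]) for i in range(5))
assert np.allclose(Sigma, S)
```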