I'm computing the PCA of a dataset. My data matrix $X$ is $D\times I$ ($D$ features, $I$ samples), and I want to perform a dimensionality reduction with PCA, so I have $\Phi$, the matrix whose columns are the eigenvectors of the covariance matrix.
We can represent the data exactly as $x = \Phi h$, and we get a dimensionality reduction by keeping only the first $K$ columns: $x \approx \Phi_{K} h_{1:K}$, where $h_{1:K}$ denotes the first $K$ components of $h$.
I want to prove that the MSE incurred by this approximation is $\sum^{D}_{i=K+1} \lambda_{i}$, where the $\lambda_{i}$ are the eigenvalues of the covariance matrix, sorted in decreasing order.
My approach is to do this:
For a single sample, writing $h$ for its coefficient vector and $\phi_{j}$ for the $j$-th column of $\Phi$:
$$\|x - \hat{x}\|^{2} = \|\Phi h - \Phi_{K} h_{1:K}\|^{2} = \Big\|\sum^{D}_{j=K+1} h_{j} \phi_{j}\Big\|^{2} = \sum^{D}_{j=K+1} h_{j}^{2},$$
where the last step uses the orthonormality of the $\phi_{j}$. Averaging over the samples,
$$MSE = \sum^{D}_{j=K+1} E[h_{j}^{2}].$$
So the question reduces to: is $E[h_{j}^{2}] = \lambda_{j}$?
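As a sanity check on the claim (not a proof), here is a small NumPy sketch with synthetic, made-up data: it computes the eigendecomposition of the empirical covariance, reconstructs the data from the top $K$ eigenvectors, and compares the reconstruction MSE against the sum of the discarded eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)
D, I, K = 5, 10_000, 2  # dimensions and cutoff are arbitrary choices

# Synthetic centered data matrix X of shape D x I (features x samples)
X = rng.standard_normal((D, I)) * np.array([3.0, 2.0, 1.5, 1.0, 0.5])[:, None]
X = X - X.mean(axis=1, keepdims=True)

# Empirical covariance and its eigendecomposition
Sigma = X @ X.T / I
lam, Phi = np.linalg.eigh(Sigma)        # eigh returns eigenvalues ascending
lam, Phi = lam[::-1], Phi[:, ::-1]      # sort descending

# Project onto the first K eigenvectors and reconstruct
Phi_K = Phi[:, :K]
X_hat = Phi_K @ (Phi_K.T @ X)

# Mean squared reconstruction error per sample
mse = np.mean(np.sum((X - X_hat) ** 2, axis=0))

# These two numbers agree (up to floating-point error)
print(mse, lam[K:].sum())
```

Because the covariance is estimated from the same data that is reconstructed, the agreement here is exact rather than approximate.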
Can anybody help me figure this out?