I am writing a quick & dirty C program to find the first three eigenvectors of a fairly large set of points, each with 512 real-valued feature dimensions.
I find the first eigenvector $u_1$ by constructing the covariance matrix $\Sigma$ from the set of points, then using power iteration to get a quick approximation.
The challenge then comes in finding an approximation to the second (and third) eigenvectors.
A brute-force method is to go back to the original data, project each point off $u_1$ via $v' = v - (v \cdot u_1)u_1$ (with $u_1$ a unit vector), then recompute the covariance matrix $\Sigma'$ and proceed from there.
It does seem like there should be a way to flatten the covariance matrix $\Sigma$ itself directly, along one of its degrees of freedom, without having to go back to the data, but I can't find the formula.
I was guessing that one could perhaps treat $\Sigma$ as a partition of column vectors and apply the subtraction above to each of them, but I'm leery of that result. Perhaps something like $\Sigma - u_1 u_1^T$?
Note this flattens the space along a single dimension, down to $(d-1)$ degrees of freedom; it does not collapse the entire space onto a one-dimensional line. The new $\Sigma'$ should still be $512 \times 512$.
How do I compute a flattened covariance matrix $\Sigma'$ from the previous matrix $\Sigma$ and its primary eigenvector $u_1$?