Suppose we have $n$ samples each containing $p$ features, arranged into a matrix $X \in \mathbb{R}^{n \times p}$. We focus on the high-dimensional setting where $p \gg n$. By definition, the covariance matrix of $X$ is $\Sigma = \frac{1}{n} (X - \bar{X})^T(X - \bar{X}) \in \mathbb{R}^{p \times p}$, where the mean row $\bar{X} = \frac{1}{n}\sum_{i=1}^n X_i \in \mathbb{R}^p$ is subtracted from each row of $X$.
I need to find $R \in \mathbb{R}^{k \times p}$, where $k \ll p$, such that $R^TR$ is the best rank-$k$ approximation to $\Sigma$ in the $\ell_2$ (spectral) norm. You may additionally assume $k < n$ if that helps. An obvious solution is to perform an SVD or Cholesky decomposition of $\Sigma$, but even forming a $p$-by-$p$ matrix is prohibitively expensive, let alone decomposing it. Preferably, the procedure to find $R$ should have time and space complexity linear in $p$.
In case this problem is intractable, I will allow $R$ to be merely a good rank-$k$ approximation, i.e. it does not need to be "the best". While I prefer a mathematically elegant solution, you can use techniques like stochastic gradient descent if necessary.
Do a thin (economy-size) QR decomposition $QT=(X-\bar X)^T$, writing $T$ for the triangular factor to avoid a clash with your $R$. In the thin format, $T$ is a square $n\times n$ upper-triangular matrix and the partial orthogonal matrix $Q$ has the shape of $X^T$, i.e. $Q \in \mathbb{R}^{p \times n}$ with orthonormal columns; this step costs $O(pn^2)$ time and $O(pn)$ space, linear in $p$.
Then compute the SVD of the triangular factor, $USV^T=T$, entirely in dimension $n$. In total, $(X-\bar X)^T=(QU)SV^T$ gives the SVD of $(X-\bar X)^T$, and $\Sigma = \frac{1}{n}(QU)S^2(QU)^T$ gives the eigen-decomposition (which here coincides with the singular value decomposition) of the covariance matrix. Now continue your dimensional analysis with the entries of $S$: keeping the $k$ largest singular values, $R = \frac{1}{\sqrt{n}}\,S_k U_k^T Q^T \in \mathbb{R}^{k\times p}$ makes $R^TR$ the best rank-$k$ approximation of $\Sigma$ (Eckart–Young), with spectral-norm error $S_{k+1}^2/n$.
Note that $\Sigma$ never needs to be formed explicitly to obtain $S$ and the vectors of its low-rank approximation.
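A minimal NumPy sketch of the whole pipeline, assuming placeholder sizes $n$, $p$, $k$ and a random $X$ (the direct formation of $\Sigma$ at the end is only there to verify the claim on a small example, and is not part of the procedure):

```python
import numpy as np

# Hypothetical sizes for illustration (p >> n, k < n) and a random data matrix.
rng = np.random.default_rng(0)
n, p, k = 50, 400, 5
X = rng.standard_normal((n, p))

# Center the samples: subtract the mean row from each row of X.
Xc = X - X.mean(axis=0)

# Thin QR of the p-by-n matrix Xc^T: Q is p-by-n, T is n-by-n upper triangular.
Q, T = np.linalg.qr(Xc.T, mode="reduced")

# SVD of the small triangular factor: T = U S V^T, all in dimension n.
U, S, Vt = np.linalg.svd(T)

# Eigen-decomposition of Sigma = (1/n) Xc^T Xc = (QU) (S^2 / n) (QU)^T.
# Rank-k factor R (k-by-p) with R^T R ~ Sigma: keep the k largest singular values.
W = Q @ U[:, :k]                       # p-by-k, orthonormal columns
R = (S[:k, None] / np.sqrt(n)) * W.T   # k-by-p

# Verification against the explicitly formed covariance (feasible only for small p).
Sigma = Xc.T @ Xc / n
err = np.linalg.norm(Sigma - R.T @ R, ord=2)
# err equals S[k]**2 / n, the (k+1)-th eigenvalue of Sigma,
# which is the best achievable spectral-norm error by Eckart-Young.
```

Only `Q`, `T`, `U`, `S` are ever materialized, so the cost stays $O(pn^2)$ time and $O(pn)$ memory, linear in $p$ as requested.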