Let $X$ be a real $n \times p$ data matrix, where we can assume $n > p$. Its (thin) SVD is $X = UDV'$, where $U$ is $n \times p$ with $U'U = I_p$, $D$ is $p \times p$ diagonal with the singular values on its diagonal, and $V$ is $p \times p$ orthogonal, i.e. $V'V = VV' = I_p$.
If $X$ is centered, then its sample covariance matrix (without the $1/(n-1)$ normalization) is $\Sigma = X'X = VD^2V'$, and, assuming all singular values are nonzero, its inverse is $\Sigma^{-1} = VD^{-2}V'$. This holds because:
$\Sigma^{-1}\Sigma = VD^{-2}V'VD^2V' = VV' = I$
and:
$\Sigma\Sigma^{-1} = VD^2V'VD^{-2}V' = VV' = I$
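A quick numerical sanity check of the identity above (a numpy sketch; the sizes $n = 50$, $p = 5$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 5
X = rng.standard_normal((n, p))
X -= X.mean(axis=0)                               # center the columns

U, s, Vt = np.linalg.svd(X, full_matrices=False)  # thin SVD: X = U diag(s) V'
Sigma = X.T @ X                                   # un-normalized covariance
Sigma_inv = Vt.T @ np.diag(s**-2) @ Vt            # V D^{-2} V'

print(np.allclose(Sigma_inv @ Sigma, np.eye(p)))  # True
print(np.allclose(Sigma @ Sigma_inv, np.eye(p)))  # True
```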
Suppose now that $X = TW'$, where $T$ is $n \times d$ with $d \ll p$ (a lower-dimensional representation) and $W$ is some $p \times d$ linear operator. We know from PCA that the solution for $W$ is the first $d$ right singular vectors of $X$, i.e. the first $d$ columns of $V$. Since $X$ then has rank at most $d$, its SVD truncates to:
$X_{n \times p} = U_{n \times d} D_{d \times d} V'_{d \times p}$
Now the covariance matrix of $X$ (still centered, as above) is "still" of the form:
$\Sigma^* = V_{p \times d}D_{d \times d}U'_{d \times n}U_{n \times d} D_{d \times d} V'_{d \times p} = V_{p \times d}D^2_{d \times d}V'_{d \times p}$
(notice $U'_{d \times n}U_{n \times d} = I_d$)
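This can be verified numerically by constructing a rank-$d$ matrix $X = TW'$ and truncating its SVD (a numpy sketch; the sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, d = 50, 5, 2
# build a rank-d X so that X = T W' holds exactly
T = rng.standard_normal((n, d))
W = rng.standard_normal((p, d))
X = T @ W.T

U, s, Vt = np.linalg.svd(X, full_matrices=False)
Ud, Dd, Vtd = U[:, :d], np.diag(s[:d]), Vt[:d, :]    # truncated factors

print(np.allclose(X, Ud @ Dd @ Vtd))                 # True: X = U_d D_d V'_d
print(np.allclose(Ud.T @ Ud, np.eye(d)))             # True: U'_d U_d = I_d
Sigma_star = X.T @ X
print(np.allclose(Sigma_star, Vtd.T @ Dd**2 @ Vtd))  # True: Sigma* = V_d D_d^2 V'_d
```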
But its inverse cannot be written in such a compact way, right? Let $A = V_{p \times d}D^{-2}_{d \times d}V'_{d \times p}$. It is not $\Sigma^*$'s inverse because:
$A\Sigma^* = V_{p \times d}D_{d \times d}^{-2}V'_{d \times p}V_{p \times d}D^2V'_{d \times p} = V_{p \times d}V'_{d \times p} \neq I_p$
Because after truncation $V_{p \times d}V'_{d \times p}$ is a rank-$d$ orthogonal projection (onto the span of the first $d$ right singular vectors), not $I_p$.
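A small numpy sketch makes this concrete: $A\Sigma^*$ comes out as $V_{p \times d}V'_{d \times p}$, which is idempotent (a projection) but not the identity when $d < p$ (sizes arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, d = 50, 5, 2
X = rng.standard_normal((n, d)) @ rng.standard_normal((d, p))  # rank-d data

_, s, Vt = np.linalg.svd(X, full_matrices=False)
Vd, D2 = Vt[:d, :].T, np.diag(s[:d] ** 2)

Sigma_star = Vd @ D2 @ Vd.T              # V_d D_d^2 V'_d
A = Vd @ np.linalg.inv(D2) @ Vd.T        # V_d D_d^{-2} V'_d

P = A @ Sigma_star
print(np.allclose(P, Vd @ Vd.T))         # True: A Sigma* = V_d V'_d
print(np.allclose(P @ P, P))             # True: idempotent, i.e. a projection
print(np.allclose(P, np.eye(p)))         # False: not the identity when d < p
```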
Question: is there anything interesting to say about $\Sigma^{*-1}$ in terms of SVD matrices?
For example, I have noticed that the higher the latent dimension $d$ (up to $p$ here), the "closer" $V_{p \times d}V'_{d \times p}$ gets to $I_p$, so I would imagine there is some way to quantify how "far" $A = V_{p \times d}D^{-2}_{d \times d}V'_{d \times p}$ is from $\Sigma^{*-1}$.
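One concrete way to measure that distance: since $V_{p \times d}V'_{d \times p}$ is an orthogonal projection of rank $d$, its Frobenius distance to the identity is exactly $\sqrt{p - d}$, which shrinks to $0$ as $d \to p$. A numpy check (sizes arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 50, 6
X = rng.standard_normal((n, p))
X -= X.mean(axis=0)
_, _, Vt = np.linalg.svd(X, full_matrices=False)

for d in range(1, p + 1):
    Vd = Vt[:d, :].T
    gap = np.linalg.norm(Vd @ Vd.T - np.eye(p))  # Frobenius norm
    print(d, gap, np.sqrt(p - d))                # gap equals sqrt(p - d)
```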
Update:
Well, I got no answers to this question. But I think the most obvious "interesting" thing I have learned about the inverse of $\Sigma^*$ is that it doesn't exist! $\Sigma^*$ has rank $d < p$, so it is singular.
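This is easy to confirm numerically, and it is perhaps worth noting that $A = V_{p \times d}D^{-2}_{d \times d}V'_{d \times p}$, while not an inverse, is the Moore-Penrose pseudoinverse of $\Sigma^*$ (a numpy sketch; sizes arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
n, p, d = 50, 5, 2
X = rng.standard_normal((n, d)) @ rng.standard_normal((d, p))  # rank-d data

_, s, Vt = np.linalg.svd(X, full_matrices=False)
Vd, D2 = Vt[:d, :].T, np.diag(s[:d] ** 2)
Sigma_star = Vd @ D2 @ Vd.T

print(np.linalg.matrix_rank(Sigma_star))           # d, not p: rank deficient
A = Vd @ np.linalg.inv(D2) @ Vd.T
print(np.allclose(A, np.linalg.pinv(Sigma_star)))  # True: A is the pseudoinverse
```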
Normally I would delete the question, but since I think it is well posed, there is an answer (even if not the one I expected), and it could help others, I'm not deleting it.