Given a matrix $A\in R^{n\times d}$ with $n>d$, suppose we have a fast way to (approximately) compute the SVD (Singular Value Decomposition) of $A$, say $A=USV^T$ with $U\in R^{n\times d}$, $S\in R^{d\times d}$, and $V\in R^{d\times d}$ (the thin SVD). It is then straightforward to see that $2AA^T=U(2S^2)U^T$, so a decomposition of $2AA^T$ also requires only $O(nd^2)$ time, just like that of $A$.
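To make the claim concrete, here is a minimal NumPy sketch (the dimensions and random matrix are illustrative assumptions) checking that the thin SVD of $A$ directly yields the decomposition $2AA^T=U(2S^2)U^T$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 8, 3  # illustrative sizes with n > d
A = rng.standard_normal((n, d))

# Thin SVD: U is n x d, s holds the d singular values, Vt is d x d.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# 2*A*A^T = U * (2*s^2) * U^T: same U, squared (and doubled) singular values.
M = 2 * A @ A.T
M_rebuilt = U @ np.diag(2 * s**2) @ U.T
assert np.allclose(M, M_rebuilt)
```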
However, consider $2AA^T-\operatorname{diag}(AA^T)\in R^{n\times n}$, where $\operatorname{diag}(AA^T)$ is the square diagonal matrix whose diagonal entries are those of $AA^T$. Running an SVD directly on $2AA^T-\operatorname{diag}(AA^T)$ would take $O(n^3)$ time. My question is: is there a method or formula that uses $A$ and its SVD $USV^T$ to obtain the SVD of $2AA^T-\operatorname{diag}(AA^T)$ indirectly? Many thanks for your help.
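For reference, this is the direct $O(n^3)$ route the question hopes to avoid, again as a small NumPy sketch with illustrative sizes. Since $2AA^T-\operatorname{diag}(AA^T)$ is symmetric, a symmetric eigendecomposition gives its SVD up to the signs of the eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 8, 3  # illustrative sizes with n > d
A = rng.standard_normal((n, d))

G = A @ A.T                      # n x n Gram matrix
B = 2 * G - np.diag(np.diag(G))  # the matrix in question; still symmetric

# Direct O(n^3) symmetric eigendecomposition of B; taking absolute
# values of the eigenvalues would turn this into an SVD.
eigvals, eigvecs = np.linalg.eigh(B)
assert np.allclose(eigvecs @ np.diag(eigvals) @ eigvecs.T, B)
```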
This should be a comment, but it is slightly too long for one and may also qualify as an answer.
In general, diagonal updates are hard to handle for a general matrix. It is known (and can be proved) that a diagonal update to an LU factorisation is as hard as computing the factorisation of the updated matrix from scratch: if $A=LU$ and $D$ is a diagonal matrix, then the LU factorisation of $A+D$ again costs $\mathcal{O}(n^3)$ (and in fact you cannot even improve the constant in front of the scaling). On some googling, a short proof of this is available here. I would expect the same to be true for the SVD.