So I have a statistical learning algorithm in which D is a diagonal matrix that changes each iteration while A stays the same. I'm looking for a fast way to invert ADA' at each iteration, which ends up being a 0.9-million-by-0.9-million matrix.
A is m by n with m < n.
My thoughts have been drawn to doing an economic SVD on A to get A = SVU' (where V ends up being a square diagonal matrix of singular values), at which point I only need to worry about inverting the inner U'DU term. Since U'*U = I, I feel like there should be something possible, but I can't figure it out.
Any ideas?
Some additional notes: A is fairly sparse, and preprocessing (e.g. the SVD of A) can take as long as necessary.
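To spell out the reduction I'm hoping for: with A = SVU' as above (S orthogonal, V square diagonal and invertible, and the columns of U orthonormal), and assuming D has no zero diagonal entries so that U'DU is invertible,
$$ADA' = SV(U'DU)VS', \qquad\text{so}\qquad (ADA')^{-1} = SV^{-1}(U'DU)^{-1}V^{-1}S'.$$
This leaves only the inner term U'DU to invert; the question is whether U'*U = I lets that inverse be updated cheaply as D changes.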
MATLAB code that I've tested to show that the suggested approach doesn't work:
rows=90;
cols=120;
G=rand(rows,cols);
[X,Y,Z]=svd(G,'econ');
A=Z'; %the above just generates A with orthonormal rows (A*A'=I), as in the second paragraph above
d=rand(1,cols);
D=diag(d);
M=A*D*A';
Minv=M^(-1) %something to compare with
%[U,E,V]=svd(A); %also tried this, it didn't work either
[U,E,V]=svd(A,'econ');
Einv=diag(1./diag(E)); %inverse of the square economic singular-value matrix
Ap=V*Einv*U'; %pseudoinverse of A from its economic SVD
Minv2=Ap'*inv(D)*Ap; %the suggested formula
max(max((Minv-Minv2).^2)) %did it work? (no)
How about: $$(ADA')^{-1} = {A^+}' D^{-1} A^+$$ where $A^+$ is the Moore-Penrose pseudoinverse?
Note that if $A=U\Sigma V'$ is the economic SVD, then $A^+=V\Sigma^{-1}U'$.
Btw, I'm assuming that your matrices are real, that $A$ has rank $m$, and that $D$ has rank $n$.
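The pseudoinverse identity in the note is easy to check numerically. Here is a sketch in NumPy rather than the question's MATLAB; the sizes and seed are arbitrary stand-ins:

```python
import numpy as np

# Small stand-in sizes with m < n, as in the question.
rng = np.random.default_rng(0)
m, n = 90, 120
A = rng.standard_normal((m, n))

# Economic SVD: U is m x m, s holds the m singular values, Vt is m x n.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Moore-Penrose pseudoinverse from the economic SVD: A+ = V Sigma^{-1} U'.
A_pinv = Vt.T @ np.diag(1.0 / s) @ U.T

# Matches NumPy's built-in pseudoinverse.
print(np.allclose(A_pinv, np.linalg.pinv(A)))  # True
```

So the pseudoinverse formula itself is fine; it is the inversion identity above it that fails, as the edit below explains.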
Edit: Turns out that this doesn't work. I don't have a working solution (yet), only the following observations.
Let $\boxed{\quad A\phantom'\quad}=\boxed{U\phantom'}\,\boxed{\Sigma\phantom'}\,\boxed{\quad V'\quad}$ be the economic SVD.
Then we can write the full SVD as $\boxed{\quad A\phantom'\quad}=\boxed{U\phantom'}\,\boxed{\Sigma\phantom'\quad 0\phantom'}\begin{array}{|cc|}\hline V'\\ \hline \quad\! N'\!\!\quad\\\hline\end{array}$.
Moreover, the columns of $U$ span the column space of $A$, and the columns of $N$ span the null space of $A$.
Consequently we can write either: $$(ADA')^{-1} = (U\Sigma V'DV\Sigma' U')^{-1}=U\Sigma^{-1}(V'DV)^{-1}\Sigma^{-1}U'$$ or: $$(ADA')^{-1} = \left(U(\Sigma\mid0)\binom{V'}{N'}D(V\mid N)\binom{\Sigma}{0} U'\right)^{-1} = U\left((\Sigma\mid0)\binom{V'}{N'}D(V\mid N)\binom{\Sigma}{0}\right)^{-1}U'$$

In the second form, since $(V\mid N)$ is orthogonal, we do have: $$\left(\binom{V'}{N'}D(V\mid N)\right)^{-1} = \binom{V'}{N'}D^{-1}(V\mid N)$$

Unfortunately, however, we cannot leave out the $N$ submatrices, and we have: $$(V'DV)^{-1}\ne V'D^{-1}V$$

This is why my original suggestion does not work.
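A quick numerical check of the first form, sketched in NumPy rather than the question's MATLAB (sizes and seed are arbitrary stand-ins):

```python
import numpy as np

# Small stand-in sizes with m < n, mirroring the MATLAB test in the question.
rng = np.random.default_rng(0)
m, n = 90, 120
A = rng.standard_normal((m, n))
d = rng.random(n) + 0.1   # keep D safely nonsingular
D = np.diag(d)

# Economic SVD: A = U @ diag(s) @ Vt, with U m x m orthogonal.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
S_inv = np.diag(1.0 / s)

# First form: (A D A')^{-1} = U Sigma^{-1} (V' D V)^{-1} Sigma^{-1} U'.
M_inv = U @ S_inv @ np.linalg.inv(Vt @ D @ Vt.T) @ S_inv @ U.T
print(np.allclose(M_inv, np.linalg.inv(A @ D @ A.T)))  # True

# ...but the inner m x m term does NOT invert entrywise:
print(np.allclose(np.linalg.inv(Vt @ D @ Vt.T),
                  Vt @ np.diag(1.0 / d) @ Vt.T))       # False
```

Note that the identity only moves the work from one $m\times m$ inverse ($ADA'$) to another ($V'DV$), so by itself it does not yet give a speedup as $D$ changes.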