For symmetric, invertible $p \times p$ matrices $V$ and $A$, show that $(V^{-1} + A^{-1})^{-1}= A(V+A)^{-1}V = V(V+A)^{-1}A$


I am trying to find an expression for $E(\beta \mid \hat{\beta})$ in Bayesian regression, and showing that, for symmetric, invertible $p \times p$ matrices $V$ and $A$,

$(V^{-1} + A^{-1})^{-1}= A(V+A)^{-1}V = V(V+A)^{-1}A$

will be useful.

My question isn't so much how to do the problem as it is: what properties of inverses will I need in order to show this statement is true?

EDIT:

It is assumed that the sum is invertible.

Thanks.


BEST ANSWER

Since the dimensions are finite, a right inverse is also a left inverse. So multiply $(V^{-1}+A^{-1})$ by what you think its inverse is, and check that the product is $I$.

$\begin{eqnarray} V(V+A)^{-1}A(V^{-1}+A^{-1}) &=& V(V+A)^{-1}(AV^{-1}+I)\\ &=& V(V+A)^{-1}(AV^{-1}+I)VV^{-1}\\ &=& V(V+A)^{-1}(A+V)V^{-1}\\ &=& VV^{-1}\\ &=& I \end{eqnarray}$

Similarly,

$\begin{eqnarray} A(V+A)^{-1}V(V^{-1}+A^{-1}) &=& A(V+A)^{-1}(I+VA^{-1})\\ &=& A(V+A)^{-1}(I+VA^{-1})AA^{-1}\\ &=& A(V+A)^{-1}(A+V)A^{-1}\\ &=& AA^{-1}\\ &=& I \end{eqnarray}$

Of course, as mentioned in the comments, this only works if $A$, $V$, and $A+V$ are all invertible :)
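As a quick sanity check on the identity (not part of the proof), here is a small NumPy sketch that builds two random symmetric invertible matrices and compares the three expressions numerically; the construction $MM^\top + pI$ is just an illustrative way to guarantee symmetry and invertibility:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 4

# Illustrative symmetric, invertible p x p matrices:
# M @ M.T is symmetric PSD, and adding p*I makes it positive definite.
M = rng.standard_normal((p, p))
V = M @ M.T + p * np.eye(p)
N = rng.standard_normal((p, p))
A = N @ N.T + p * np.eye(p)

inv = np.linalg.inv
lhs = inv(inv(V) + inv(A))          # (V^{-1} + A^{-1})^{-1}
mid = A @ inv(V + A) @ V            # A (V+A)^{-1} V
rhs = V @ inv(V + A) @ A            # V (V+A)^{-1} A

assert np.allclose(lhs, mid)
assert np.allclose(mid, rhs)
```

All three agree up to floating-point tolerance, matching the algebra above.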