Inverse of the sum of matrices


I have two square matrices: $A$ and $B$. $A^{-1}$ is known and I want to calculate $(A+B)^{-1}$. Are there theorems that help with calculating the inverse of the sum of matrices? In general case $B^{-1}$ is not known, but if it is necessary then it can be assumed that $B^{-1}$ is also known.

6 Answers

Best Answer (score 8)

In general, $A+B$ need not be invertible, even when $A$ and $B$ are. But one might ask whether you can have a formula under the additional assumption that $A+B$ is invertible.

As noted by Adrián Barquero, there is a paper by Kenneth Miller, "On the Inverse of the Sum of Matrices," published in Mathematics Magazine in 1981, that addresses this.

He proves the following:

Lemma. Let $A$ and $A+B$ be invertible, and suppose $B$ has rank $1$. Set $g=\operatorname{trace}(BA^{-1})$. Then $g\neq -1$ and $$(A+B)^{-1} = A^{-1} - \frac{1}{1+g}A^{-1}BA^{-1}.$$

From this lemma, we can take a general $A+B$ that is invertible and write it as $A+B = A + B_1+B_2+\cdots+B_r$, where $B_i$ each have rank $1$ and such that each $A+B_1+\cdots+B_k$ is invertible (such a decomposition always exists if $A+B$ is invertible and $\mathrm{rank}(B)=r$). Then you get:

Theorem. Let $A$ and $A+B$ be nonsingular matrices, and let $B$ have rank $r\gt 0$. Let $B=B_1+\cdots+B_r$, where each $B_i$ has rank $1$, and each $C_{k+1} = A+B_1+\cdots+B_k$ is nonsingular. Setting $C_1 = A$, then $$C_{k+1}^{-1} = C_{k}^{-1} - g_kC_k^{-1}B_kC_k^{-1}$$ where $g_k = \frac{1}{1 + \operatorname{trace}(C_k^{-1}B_k)}$. In particular, $$(A+B)^{-1} = C_r^{-1} - g_rC_r^{-1}B_rC_r^{-1}.$$

(If the rank of $B$ is $0$, then $B=0$, so $(A+B)^{-1}=A^{-1}$).
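As a sanity check, here is a minimal NumPy sketch of the theorem's recursion. The particular matrices, the seed, and the rank-$1$ splitting $B_k = u_k v_k^{\mathsf T}$ are all made up for illustration; the matrices are chosen well-conditioned so every intermediate $C_k$ stays invertible.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 4, 2

A = 3.0 * np.eye(n) + 0.1 * rng.standard_normal((n, n))
A_inv = np.linalg.inv(A)

# Build a rank-r B as a sum of rank-1 pieces B_k = u_k v_k^T.
U = 0.5 * rng.standard_normal((n, r))
V = 0.5 * rng.standard_normal((n, r))
B = U @ V.T

# Miller's recursion: C_1 = A, C_{k+1}^{-1} = C_k^{-1} - g_k C_k^{-1} B_k C_k^{-1},
# with g_k = 1 / (1 + trace(C_k^{-1} B_k)).
C_inv = A_inv.copy()
for k in range(r):
    B_k = np.outer(U[:, k], V[:, k])
    g_k = 1.0 / (1.0 + np.trace(C_inv @ B_k))
    C_inv = C_inv - g_k * C_inv @ B_k @ C_inv

err = np.max(np.abs(C_inv - np.linalg.inv(A + B)))
```

Note that for a single rank-$1$ piece this recursion is exactly the Sherman–Morrison formula, since $\operatorname{trace}(C^{-1}uv^{\mathsf T}) = v^{\mathsf T}C^{-1}u$.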

Answer (score 3)

Assuming everything is nicely invertible, you are probably looking for the Sherman–Morrison–Woodbury (SMW) identity (which, I think, can also be generalized to pseudoinverses if needed).

There is a caveat about the invertibility assumptions involved, but in general, if $B$ is low-rank, you'd be happy using SMW.
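For reference, a hedged NumPy sketch of the SMW identity, assuming $B$ is already given in factored form $B = UCV$ with $C$ a small invertible $k \times k$ block (the matrices and seed here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 6, 2

A = np.diag(rng.uniform(1.0, 2.0, size=n))   # cheap-to-invert A
A_inv = np.diag(1.0 / np.diag(A))

# Low-rank B in factored form B = U C V, with C an invertible k x k block.
U = rng.standard_normal((n, k))
C = np.diag(rng.uniform(0.5, 1.0, size=k))
V = rng.standard_normal((k, n))

# SMW: (A + UCV)^{-1} = A^{-1} - A^{-1} U (C^{-1} + V A^{-1} U)^{-1} V A^{-1}
inner = np.linalg.inv(np.linalg.inv(C) + V @ A_inv @ U)
smw_inv = A_inv - A_inv @ U @ inner @ V @ A_inv

err = np.max(np.abs(smw_inv - np.linalg.inv(A + U @ C @ V)))
```

The payoff is that only $k \times k$ systems are solved on top of the already-known $A^{-1}$, which is why SMW is the standard tool when $B$ has low rank.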

Answer (score 3)

A formal power series expansion is possible: $$ \begin{eqnarray} (A + \epsilon B)^{-1} &=& \left(A \left(I + \epsilon A^{-1}B\right)\right)^{-1} \\ &=& \left(I + \epsilon A^{-1}B\right)^{-1} A^{-1} \\ &=& \left(I - \epsilon A^{-1}B + \epsilon^2 A^{-1}BA^{-1}B - \cdots\right) A^{-1} \\ &=& A^{-1} - \epsilon A^{-1} B A^{-1} + \epsilon^2 A^{-1} B A^{-1} B A^{-1} - \cdots \end{eqnarray} $$ Under appropriate conditions on the eigenvalues of $A$ and $B$ (such that $A$ is sufficiently "large" compared to $B$), this will converge to the correct result at $\epsilon=1$.
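A minimal numerical check of this expansion (truncated; the matrices, sizes, and truncation length are illustrative), using the fact that the series converges at $\epsilon = 1$ when the spectral radius of $A^{-1}B$ is below $1$:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5

A = 4.0 * np.eye(n)                       # "large" A
B = 0.5 * rng.standard_normal((n, n))     # "small" B
A_inv = np.linalg.inv(A)

M = A_inv @ B
# Convergence at eps = 1 requires the spectral radius of A^{-1}B to be below 1.
assert np.max(np.abs(np.linalg.eigvals(M))) < 1.0

# Partial sums of A^{-1} - A^{-1}BA^{-1} + A^{-1}BA^{-1}BA^{-1} - ...
approx = np.zeros((n, n))
term = np.eye(n)
for _ in range(60):
    approx += term @ A_inv     # adds (-M)^k A^{-1}
    term = term @ (-M)

err = np.max(np.abs(approx - np.linalg.inv(A + B)))
```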

Answer (score 0)

$(A+B)^{-1} = A^{-1} - A^{-1}BA^{-1} + A^{-1}BA^{-1}BA^{-1} - A^{-1}BA^{-1}BA^{-1}BA^{-1} + \cdots$

provided $\|A^{-1}B\|<1$ or $\|BA^{-1}\| < 1$ for some submultiplicative matrix norm $\|\cdot\|$. This is just the Neumann series for the inverse (equivalently, the Taylor expansion of the inversion function), together with the standard convergence criterion.

(posted essentially at the same time as mjqxxx)

Answer (score 5)

I found this accidentally.

Suppose we are given $A$ and $B$, where $A$ and $A+B$ are invertible; we want an expression for $(A+B)^{-1}$ that does not require inverting anything else up front. Suppose we can write $(A+B)^{-1} = A^{-1} + X$; then a simple, straightforward computation yields $X$: \begin{equation} (A+B)^{-1} = A^{-1} + X \end{equation} \begin{equation} (A^{-1} + X) (A + B) = I \end{equation} \begin{equation} A^{-1} A + X A + A^{-1} B + X B = I \end{equation} \begin{equation} X(A + B) = - A^{-1} B \end{equation} \begin{equation} X = - A^{-1} B ( A + B)^{-1} \end{equation} \begin{equation} X = - A^{-1} B (A^{-1} + X) \end{equation} \begin{equation} (I + A^{-1}B) X = - A^{-1} B A^{-1} \end{equation} \begin{equation} X = - (I + A^{-1}B)^{-1} A^{-1} B A^{-1} \end{equation} (Here $I + A^{-1}B = A^{-1}(A+B)$ is invertible, so the last step is justified.)

This is a simplification of the lemma presented by Ken Miller (1981).
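A quick numerical sanity check of the closed form $X = -(I + A^{-1}B)^{-1}A^{-1}BA^{-1}$ (random test matrices chosen for illustration; note $B^{-1}$ is never used):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4

A = 2.0 * np.eye(n) + 0.2 * rng.standard_normal((n, n))
B = 0.5 * rng.standard_normal((n, n))    # B need not be invertible
A_inv = np.linalg.inv(A)

# X = -(I + A^{-1}B)^{-1} A^{-1} B A^{-1}, so (A+B)^{-1} = A^{-1} + X.
X = -np.linalg.inv(np.eye(n) + A_inv @ B) @ A_inv @ B @ A_inv

err = np.max(np.abs((A_inv + X) - np.linalg.inv(A + B)))
```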

Answer (score 1)

By means of an augmented matrix, start with $[\,A+B \mid I\,]$.

Multiply on the left by $A^{-1}$: $$[\,I + A^{-1}B \mid A^{-1}\,]$$

Factor out $B$ on the right: $$[\,(B^{-1}+A^{-1})B \mid A^{-1}\,]$$

Multiply on the left by $(B^{-1}+A^{-1})^{-1}$: $$[\,B \mid (B^{-1}+A^{-1})^{-1}A^{-1}\,]$$

Multiply on the left by $B^{-1}$: $$[\,I \mid B^{-1}(B^{-1}+A^{-1})^{-1}A^{-1}\,]$$

Thus $(A+B)^{-1} = B^{-1}(B^{-1}+A^{-1})^{-1}A^{-1}$ (this requires $A$, $B$, and $A+B$ all to be invertible; note $B^{-1}+A^{-1} = B^{-1}(A+B)A^{-1}$, so its invertibility follows).
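A quick NumPy check of this identity (the test matrices and seed are illustrative; both $A$ and $B$ are deliberately well-conditioned, since both must be invertible here):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4

# Both A and B must be invertible for this identity to apply.
A = 2.0 * np.eye(n) + 0.3 * rng.standard_normal((n, n))
B = 3.0 * np.eye(n) + 0.3 * rng.standard_normal((n, n))

A_inv = np.linalg.inv(A)
B_inv = np.linalg.inv(B)

# (A+B)^{-1} = B^{-1} (B^{-1} + A^{-1})^{-1} A^{-1}
rhs = B_inv @ np.linalg.inv(B_inv + A_inv) @ A_inv
err = np.max(np.abs(np.linalg.inv(A + B) - rhs))
```

Unlike the rank-1 and SMW approaches, this formula needs the full $B^{-1}$, so it only helps when both inverses are already on hand.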