Inverse of the sum of a symmetric and a diagonal matrix


I have two matrices $A$ and $B$ with quite a few notable properties.

They are both square.

They are both symmetric.

They are the same size.

$A$ has $1$'s along the diagonal and real numbers in the interval $(0,1)$ off the diagonal.

$B$ has real numbers along the diagonal and $0$'s on the off-diagonal.

So, they look like this:

$$ A= \left[\begin{matrix} 1 & b & ... & z\\ b & 1 & ... & y\\ \vdots & \vdots & \ddots & \vdots \\ z & y & ... & 1 \end{matrix}\right] $$ and $$ B = \left[ \begin{matrix} \alpha & 0 & ... & 0\\ 0 & \beta & ... & 0\\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & ... & \omega \end{matrix}\right] $$

I need to calculate $(A+\delta B)^{-1}$ many times, with a different value of $\delta$ each time. This can be done directly, but it may be time consuming, depending on the number of $\delta$'s and the size of $A$ and $B$.
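To make the setup concrete, here is a minimal NumPy sketch of the direct approach: one full inversion per value of $\delta$. The sizes, seed, and variable names are mine; the off-diagonal entries of $A$ are kept small and the diagonal of $B$ positive only so that $A+\delta B$ is guaranteed invertible (diagonally dominant) in the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Symmetric A: unit diagonal, off-diagonal entries in (0, 1/n) so that
# A + delta*B is diagonally dominant, hence invertible (demo assumption).
A = rng.uniform(0.0, 1.0 / n, (n, n))
A = (A + A.T) / 2
np.fill_diagonal(A, 1.0)

# Diagonal B with positive real entries (demo assumption).
B = np.diag(rng.uniform(0.5, 2.0, n))

deltas = [0.5, 1.0, 2.0, 5.0]

# Direct approach: one O(n^3) inversion per delta.
inverses = [np.linalg.inv(A + d * B) for d in deltas]

# Sanity check: each result really is the inverse.
for d, inv in zip(deltas, inverses):
    assert np.allclose((A + d * B) @ inv, np.eye(n))
```

For $p$ values of $\delta$ this is $p$ independent $O(n^3)$ factorizations, which is exactly the cost the question hopes to avoid.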

If the values along the diagonal of $B$ were all $1$, it would be the identity matrix, and $A+\delta I$ would share $A$'s eigenvectors, so the inverse of the sum could be computed simply by inverting the shifted eigenvalues. But, alas, that is not the case.

My intuition is that no such matrix algebra shortcut can exist in the scenario under consideration, but I am hopeful that someone can prove me wrong.

edit: I should have provided more information about this. What I really want is a matrix $M$ such that $MM^{T} = (A + \delta B)^{-1}$. If I can eigendecompose $A+\delta B$ quickly, then I need only invert the eigenvalues ($n$ scalar divisions) and scale the eigenvectors by them ($n$ scalar-vector multiplications) to get $M$.
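The factor $M$ described in the edit can be read off a symmetric eigendecomposition: if $A+\delta B = Q\,\mathrm{diag}(w)\,Q^T$ with all $w_i>0$, then $M = Q\,\mathrm{diag}(w^{-1/2})$ satisfies $MM^T=(A+\delta B)^{-1}$. A sketch, under the assumption (mine, not stated in the question) that $A+\delta B$ is positive definite:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# Demo matrices chosen so A + delta*B is positive definite (an assumption):
# symmetric A with unit diagonal and small positive off-diagonal entries,
# B diagonal with positive entries.
A = rng.uniform(0.0, 1.0 / n, (n, n))
A = (A + A.T) / 2
np.fill_diagonal(A, 1.0)
B = np.diag(rng.uniform(0.5, 2.0, n))
delta = 0.7

# Symmetric eigendecomposition: A + delta*B = Q diag(w) Q^T.
w, Q = np.linalg.eigh(A + delta * B)

# M = Q diag(w^{-1/2}); column i of Q is scaled by w[i]^{-1/2}.
M = Q / np.sqrt(w)

# M M^T = Q diag(1/w) Q^T = (A + delta*B)^{-1}.
assert np.allclose(M @ M.T, np.linalg.inv(A + delta * B))
```

A Cholesky factor of $(A+\delta B)^{-1}$ would serve equally well; the eigendecomposition route is shown because it matches the edit's description, but each new $\delta$ still costs a full $O(n^3)$ decomposition.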


BEST ANSWER

This is more a long comment or a remark than an answer, but maybe it can help, at least in certain cases.

Assume $B$ is invertible, and so is $A+\delta B$.

Notice that $$A+\delta B = \left(\frac{1}{\delta}AB^{-1}+I\right)\delta B.$$ Therefore $$(A+\delta B)^{-1} = \frac{1}{\delta}B^{-1}\left(\frac{1}{\delta}AB^{-1}+I\right)^{-1}.$$ Now assume that the spectral radius satisfies $\rho\left(\frac{1}{\delta}AB^{-1}\right)<1$, which holds whenever $|\delta|$ is large enough. Then the Neumann series converges: $$\left(\frac{1}{\delta}AB^{-1}+I\right)^{-1} = \sum_{k\ge0}(-\delta)^{-k}(AB^{-1})^k.$$ In conclusion, we have $$(A+\delta B)^{-1} = -\sum_{k\ge0}(-\delta)^{-(k+1)}B^{-1}(AB^{-1})^k.$$
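The series above can be checked numerically by truncating it. A sketch (demo matrices and the choice $\delta=10$ are mine; $\delta$ is taken large enough that the spectral radius condition holds):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
A = rng.uniform(0.0, 1.0 / n, (n, n))
A = (A + A.T) / 2
np.fill_diagonal(A, 1.0)
B = np.diag(rng.uniform(0.5, 2.0, n))
delta = 10.0  # large enough that rho(A B^{-1} / delta) < 1 (an assumption)

Binv = np.diag(1.0 / np.diag(B))
C = A @ Binv
# Convergence condition: spectral radius of C/delta below 1.
assert max(abs(np.linalg.eigvals(C / delta))) < 1

# Partial sums of  -sum_{k>=0} (-delta)^{-(k+1)} B^{-1} (A B^{-1})^k.
approx = np.zeros((n, n))
power = np.eye(n)  # holds (A B^{-1})^k, starting at k = 0
for k in range(50):
    approx -= (-delta) ** -(k + 1) * (Binv @ power)
    power = power @ C

assert np.allclose(approx, np.linalg.inv(A + delta * B))
```

Each extra term costs one matrix product, so this only pays off when few terms suffice, i.e. when $|\delta|$ is large.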

SECOND ANSWER

Assume that $\delta$ ranges over a set with $p$ elements and $A,B\in M_n(\mathbb{R})$. Then the complexity of computing all the $(A+\delta B)^{-1}$ directly is $O(pn^3)$. Implicitly you write that we can do better when $B=I_n$; do you want to bet?

Point of view 1. We assume that $A$ and $A+\delta B$ are invertible and that $A^{-1}B$ is diagonalizable: $A^{-1}B=PDP^{-1}$, with $(\lambda_i)_i=\mathrm{spectrum}(A^{-1}B)$. Then $$(A+\delta B)^{-1}=A^{-1}-\delta A^{-1}B(I+\delta A^{-1}B)^{-1}A^{-1}=A^{-1}- A^{-1}BP\,\mathrm{diag}(\mu_1,\cdots,\mu_n)\,P^{-1}A^{-1},$$ where $\mu_i=\delta/(1+\delta\lambda_i)$. After precalculation of $P$, $D$, $U=A^{-1}BP$ and $V=P^{-1}A^{-1}$, the required work per $\delta$ is the calculation of $U\,\mathrm{diag}(\mu_1,\cdots,\mu_n)\,V$ (a tensor product). If we stack the result matrix into a column, then there is a fixed $W\in M_{n^2,n}$ such that the calculation reduces to $W[\mu_1,\cdots,\mu_n]^T$, whose complexity is $O(n^3)$.
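This precomputation scheme can be sketched in NumPy as follows (demo matrices and the helper name `inv_shifted` are mine; with $A$ positive definite and $B$ positive diagonal, $A^{-1}B$ has real positive eigenvalues, so $1+\delta\lambda_i\neq 0$ for the $\delta$ used):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5
A = rng.uniform(0.0, 1.0 / n, (n, n))
A = (A + A.T) / 2
np.fill_diagonal(A, 1.0)
B = np.diag(rng.uniform(0.5, 2.0, n))

# --- precalculation (done once) ---
Ainv = np.linalg.inv(A)
lam, P = np.linalg.eig(Ainv @ B)   # A^{-1}B = P diag(lam) P^{-1}
U = Ainv @ B @ P                   # U = A^{-1} B P
V = np.linalg.solve(P, Ainv)       # V = P^{-1} A^{-1}

# --- per-delta work: no new full-matrix factorization ---
def inv_shifted(delta):
    mu = delta / (1.0 + delta * lam)
    # U diag(mu) V, with diag(mu) applied as a column scaling of U.
    return (Ainv - (U * mu) @ V).real

for delta in (0.3, 1.0, 4.0):
    assert np.allclose(inv_shifted(delta), np.linalg.inv(A + delta * B))
```

Note the per-$\delta$ cost is still dominated by the product $(U\,\mathrm{diag}(\mu))V$, i.e. $O(n^3)$, which is the answer's point: the precomputation changes the constant, not the exponent.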

If $B=I$, $\mathrm{spectrum}(A)=(\alpha_i)_i$, and $A$ is diagonalizable, $A=QD_1Q^{-1}$, then $(A+\delta I)^{-1}=Q\,\mathrm{diag}(\nu_i)\,Q^{-1}$, where $\nu_i=1/(\alpha_i+\delta)$. Again, the complexity is $O(n^3)$.
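For the $B=I$ special case, since $A$ is symmetric one can use the orthogonal eigendecomposition, so $Q^{-1}=Q^T$. A sketch (demo matrix and the helper name `inv_shifted_identity` are mine):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5
A = rng.uniform(0.0, 1.0 / n, (n, n))
A = (A + A.T) / 2
np.fill_diagonal(A, 1.0)

# One-time symmetric eigendecomposition: A = Q diag(alpha) Q^T.
alpha, Q = np.linalg.eigh(A)

def inv_shifted_identity(delta):
    # (A + delta*I)^{-1} = Q diag(1/(alpha_i + delta)) Q^T,
    # applied as a column scaling of Q.
    return (Q / (alpha + delta)) @ Q.T

for delta in (0.3, 1.0, 4.0):
    assert np.allclose(inv_shifted_identity(delta),
                       np.linalg.inv(A + delta * np.eye(n)))
```

Even here, forming the product $Q\,\mathrm{diag}(\nu)\,Q^T$ explicitly is $O(n^3)$ per $\delta$; only applying the inverse to a single vector would be cheaper ($O(n^2)$).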

Point of view 2. $(A+\delta B)^{-1}$ is a rational function of $\delta$ of the form $\dfrac{1}{p_n(\delta)}Z(\delta)$, where each entry $z_{i,j}(\delta)$ is a polynomial of degree at most $n-1$ and $p_n$ is a polynomial of degree $n$ (when $B=I$, the form is no simpler!). The required complexity is that of evaluating $n^2$ polynomials of degree at most $n-1$. With Horner's method, this costs $n^2$ times ($n$ multiplications, $n$ additions). Using a precalculation and the Knuth/Eve method, we can do better: $n^2$ times ($n/2$ multiplications, $n$ additions); thus, the complexity is almost halved (in practice the gain is smaller, because the gap between the cost of a multiplication and an addition has narrowed lately).

We can also write the result in the form (with precalculation, of course) $(A+\delta B)^{-1}=\dfrac{1}{p_n(\delta)}\left(U_0+\delta U_1+\cdots+\delta^{n-2}U_{n-2}+\delta^{n-1}\Delta\right)$, where $U_i,\Delta\in M_n$ and $\Delta$ is diagonal; again, the associated complexity is $O(n^3)$.
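One concrete way to obtain such a representation (my own sketch, not necessarily the precalculation the answer has in mind) is to recover $p_n$ and the adjugate entries by polynomial interpolation at $n+1$ sample points, then evaluate by Horner's rule for each new $\delta$:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 4
A = rng.uniform(0.0, 1.0 / n, (n, n))
A = (A + A.T) / 2
np.fill_diagonal(A, 1.0)
B = np.diag(rng.uniform(0.5, 2.0, n))

# --- precalculation: sample det and adjugate at n+1 points and interpolate.
# Entries of adj(A + t*B) have degree <= n-1 and det has degree n, so
# degree-n interpolation through n+1 points recovers them exactly.
pts = np.arange(1.0, n + 2.0)
dets = np.array([np.linalg.det(A + t * B) for t in pts])
adjs = np.array([np.linalg.det(A + t * B) * np.linalg.inv(A + t * B)
                 for t in pts])            # adj(M) = det(M) * M^{-1}

pcoef = np.polyfit(pts, dets, n)                          # p_n, highest first
Zcoef = np.polyfit(pts, adjs.reshape(n + 1, n * n), n)    # one poly per entry

# --- per-delta work: Horner evaluation of n^2 + 1 polynomials ---
def inv_poly(delta):
    Zd = np.zeros(n * n)
    for row in Zcoef:          # coefficients stored highest degree first
        Zd = Zd * delta + row
    pd = 0.0
    for c in pcoef:
        pd = pd * delta + c
    return Zd.reshape(n, n) / pd

for delta in (0.5, 7.0):
    assert np.allclose(inv_poly(delta), np.linalg.inv(A + delta * B))
```

Consistent with the answer's count, the evaluation costs $n^2$ Horner runs of length $n$; for large $n$ the interpolation would need more numerically careful sample points than the naive $1,\dots,n+1$ used here.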

Conclusion: do not expect miracles.

EDIT. I just read your edit, which I find hard to follow. One thing is clear: even after precalculations, you cannot in general diagonalize $A+\delta B$ with complexity $o(n^3)$ (except when $B$ is a scalar multiple of the identity).