Inverse of a matrix with uniform off-diagonals


Suppose that we have an all-positive matrix whose off-diagonal elements are all identical. Can one compute the inverse of such a matrix analytically, or more efficiently than with a general-purpose algorithm? For example:

$$ \begin{bmatrix} 1 & .1 & .1 \\ .1 & 2 & .1 \\ .1 & .1 & 3 \\ \end{bmatrix} $$


Best answer:

Let $A$ be the matrix whose entries are $A_{i,j} = \begin{cases}a_{i} & \text{if} \ i = j \\ \alpha & \text{if} \ i \neq j\end{cases}$.

Let $D$ be the diagonal matrix with diagonal entries $D_{i,i} = a_i - \alpha$.

Then $A = D + \alpha vv^T$, where $v$ is the $n \times 1$ vector of all ones.

If $A$ and $D$ are both invertible, then we may apply the Sherman-Morrison formula to get $$A^{-1} = (D+\alpha vv^T)^{-1} = D^{-1} - \dfrac{\alpha D^{-1} vv^T D^{-1}}{1 + \alpha v^TD^{-1}v}$$

Since $D$ is diagonal, the inverse of $D$ is simply a diagonal matrix with entries $(D^{-1})_{i,i} = (D_{i,i})^{-1} = \dfrac{1}{a_i - \alpha}$.

From here, it is easy to compute $A^{-1}$. The $i,j$-th entry of $A^{-1}$ is given by $$(A^{-1})_{i,j} = \begin{cases}\dfrac{1}{a_i-\alpha} - \dfrac{1}{c(a_i-\alpha)^2} & \text{if} \ i = j \\ - \dfrac{1}{c(a_i-\alpha)(a_j-\alpha)} & \text{if} \ i \neq j\end{cases},$$

where $c = \dfrac{1}{\alpha} + \displaystyle\sum_{i = 1}^{n}\dfrac{1}{a_i-\alpha}$.
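As a sanity check, this closed form can be compared against a general-purpose inverse on the $3 \times 3$ example from the question. This is a minimal NumPy sketch; the function name and the comparison are mine, not part of the answer:

```python
import numpy as np

def uniform_offdiag_inverse(a, alpha):
    """Inverse of the matrix with diagonal entries a_i and every
    off-diagonal entry equal to alpha, via the closed form above.
    Assumes alpha != 0, a_i != alpha for all i, and c != 0."""
    a = np.asarray(a, dtype=float)
    d = 1.0 / (a - alpha)                # reciprocals 1/(a_i - alpha)
    c = 1.0 / alpha + d.sum()            # c = 1/alpha + sum_i 1/(a_i - alpha)
    inv = -np.outer(d, d) / c            # off-diagonal: -1/(c (a_i - alpha)(a_j - alpha))
    inv[np.diag_indices_from(inv)] += d  # diagonal gains the extra 1/(a_i - alpha)
    return inv

# The example from the question: diagonal (1, 2, 3), off-diagonal 0.1.
A = np.array([[1.0, 0.1, 0.1],
              [0.1, 2.0, 0.1],
              [0.1, 0.1, 3.0]])
print(np.allclose(uniform_offdiag_inverse([1.0, 2.0, 3.0], 0.1),
                  np.linalg.inv(A)))    # True
```

Note that writing out the entries costs $O(n^2)$ (with only $O(n)$ work beyond filling the output), versus $O(n^3)$ for a generic inversion.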

Of course, this doesn't handle the case where $D$ isn't invertible, which occurs precisely when $a_i = \alpha$ for some $i$.

Another answer:

Under some circumstances. Let $\lambda$ be the number that occurs in the off-diagonal positions, $J$ the all-$1$ matrix, and $I$ the identity matrix, and write your matrix as $A + \lambda J$, where $A$ is the diagonal matrix obtained by subtracting $\lambda$ from each diagonal entry. Suppose $A$ is invertible, and let $s$ be the sum of the reciprocals of its diagonal entries. Since $JA^{-1}J = sJ$, we have $(A^{-1}J)^i = s^{i-1}A^{-1}J$ for $i \ge 1$, and so $$\eqalign{ (A+\lambda J)^{-1} &= [I + \lambda A^{-1} J]^{-1}A^{-1} = \left[ \sum_{i=0}^\infty (-\lambda)^i (A^{-1} J)^i \right]A^{-1} \cr&= \left[I + \sum_{i=1}^\infty(-\lambda)^i s^{i-1} (A^{-1} J) \right]A^{-1} = \left[I - {\lambda\over 1+\lambda s}\,A^{-1} J \right]A^{-1} = A^{-1} - {\lambda\over 1+\lambda s}\,A^{-1} J A^{-1}. }$$ You need $|\lambda s| < 1$ for the geometric series to converge (since $s$ is the only nonzero eigenvalue of $A^{-1}J$, no separate norm condition is required), although the final rank-one formula remains valid whenever $1 + \lambda s \neq 0$.

In your case, $\lambda=0.1$ and $A=\pmatrix{0.9&0&0\cr 0&1.9&0\cr 0&0&2.9\cr}$, so that $A+\lambda J$ is the matrix in the question. Then $A^{-1} = \pmatrix{1/0.9&0&0\cr 0&1/1.9&0\cr 0&0&1/2.9\cr}$ and $\displaystyle s={1\over0.9}+{1\over 1.9}+{1\over 2.9}\approx 1.9822544$, which gives you $$(A+\lambda J)^{-1}\approx \pmatrix{1.008078 & -0.048805 & -0.031976\cr -0.048805 & 0.503198 & -0.015146\cr -0.031976 & -0.015146 & 0.334904\cr }.$$
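This can be verified numerically. The sketch below (a NumPy check of mine, not part of the answer) applies the rank-one update for $(A+\lambda J)^{-1}$, with $\lambda = 0.1$ and $A$ as above, and compares it against a general-purpose inverse:

```python
import numpy as np

lam = 0.1
A = np.diag([0.9, 1.9, 2.9])          # diagonal part, so A + lam*J is the question's matrix
J = np.ones((3, 3))                   # all-ones matrix
A_inv = np.diag(1.0 / np.diag(A))
s = (1.0 / np.diag(A)).sum()          # sum of reciprocals of A's diagonal entries

# Rank-one update: (A + lam*J)^{-1} = A^{-1} - lam/(1 + lam*s) * A^{-1} J A^{-1}
M_inv = A_inv - lam / (1.0 + lam * s) * A_inv @ J @ A_inv
print(np.allclose(M_inv, np.linalg.inv(A + lam * J)))   # True
```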

(Written up and posted after JimmyK4542's answer, but I spent too much time (re)deriving this to just discard it.)