Inverse of a matrix (all rows equal) plus the identity matrix


Let $A$ be a matrix where all rows are equal, for example,

$$A=\left[\begin{array}{ccc} a_{1} & a_{2} & a_{3} \\ a_{1} & a_{2} & a_{3} \\ a_{1} & a_{2} & a_{3} \end{array}\right]$$

Then what is the inverse of the matrix $B=I+A$, where $I$ is the identity matrix? For example,

$$B=\left[\begin{array}{ccc} a_{1}+1 & a_{2} & a_{3} \\ a_{1} & a_{2}+1 & a_{3} \\ a_{1} & a_{2} & a_{3}+1 \end{array}\right]$$

I have a conjecture, which computation has so far confirmed:

$$B^{-1}=I-\frac{A}{\mbox{tr}(A)+1}$$

Why is this true?
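Before proving it, the conjecture is easy to test numerically. A minimal sketch in plain Python, using an arbitrary (hypothetical) choice of $a_1,a_2,a_3$:

```python
# Check the conjecture B^{-1} = I - A/(tr(A)+1) on one hypothetical example.

def matmul(X, Y):
    """Multiply two square matrices given as lists of rows."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

a = [2.0, -1.0, 5.0]                 # arbitrary row; any values with sum != -1
n = len(a)
A = [a[:] for _ in range(n)]         # all rows equal
I = [[float(i == j) for j in range(n)] for i in range(n)]
B = [[A[i][j] + I[i][j] for j in range(n)] for i in range(n)]   # B = I + A

t = sum(a)                           # tr(A) = a1 + a2 + a3
Binv = [[I[i][j] - A[i][j] / (t + 1.0) for j in range(n)] for i in range(n)]

P = matmul(B, Binv)
assert all(abs(P[i][j] - I[i][j]) < 1e-12 for i in range(n) for j in range(n))
```

Any other choice of row (with $\mbox{tr}(A)+1\ne 0$) passes the same check.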

There are 3 solutions below.


Note that $A=ea^T$, where $a=\begin{bmatrix} a_1 \\ a_2 \\ a_3 \end{bmatrix}$ and $e = \begin{bmatrix} 1 \\ 1 \\ 1\end{bmatrix}$.

Now we can use the Sherman-Morrison formula, which states that if a matrix $C$ is invertible, then $C+uv^T$ is invertible if and only if $1+v^TC^{-1}u \ne 0$, in which case $$(C+uv^T)^{-1}=C^{-1}-\frac{C^{-1}uv^TC^{-1}}{1+v^TC^{-1}u}.$$ Here $B=I+ea^T$, so $B$ is invertible if and only if $1+a^Te=1+\sum_{i=1}^3a_i=1+\mbox{tr}(A) \ne 0$, and then $$B^{-1}=I-\frac{ea^T}{1+\mbox{tr}(A)}=I-\frac{A}{1+\mbox{tr}(A)}.$$
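The Sherman-Morrison formula itself can be sanity-checked numerically. A sketch with a hypothetical diagonal $C$ (so $C^{-1}$ is immediate) and arbitrary vectors $u$, $v$:

```python
# Verify (C + u v^T)^{-1} = C^{-1} - C^{-1} u v^T C^{-1} / (1 + v^T C^{-1} u)
# on one hypothetical example with C diagonal.

n = 3
C = [[2.0, 0.0, 0.0], [0.0, 4.0, 0.0], [0.0, 0.0, 5.0]]
Cinv = [[0.5, 0.0, 0.0], [0.0, 0.25, 0.0], [0.0, 0.0, 0.2]]
u = [1.0, 1.0, 1.0]
v = [3.0, -2.0, 1.0]

Cu = [sum(Cinv[i][k] * u[k] for k in range(n)) for i in range(n)]   # C^{-1} u
vC = [sum(v[k] * Cinv[k][j] for k in range(n)) for j in range(n)]   # v^T C^{-1}
denom = 1.0 + sum(v[i] * Cu[i] for i in range(n))                   # 1 + v^T C^{-1} u

M = [[C[i][j] + u[i] * v[j] for j in range(n)] for i in range(n)]   # C + u v^T
Minv = [[Cinv[i][j] - Cu[i] * vC[j] / denom for j in range(n)] for i in range(n)]

P = [[sum(M[i][k] * Minv[k][j] for k in range(n)) for j in range(n)]
     for i in range(n)]
assert all(abs(P[i][j] - (1.0 if i == j else 0.0)) < 1e-12
           for i in range(n) for j in range(n))
```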


In the following, let us assume that $\mbox{tr}(A)+1\not = 0$.

We have $B=A+I$. Let us compute $C:=(A+I)\left(I-\frac{A}{\mbox{tr}(A)+1}\right)$ and see if we get the identity matrix.

$$C=-\frac{1}{\mbox{tr}(A)+1}A^2+\frac{\mbox{tr}(A)}{\mbox{tr}(A)+1}A+I$$

so to obtain the identity matrix, it suffices to prove that $A^2=\mbox{tr}(A)A$.

The $(i,j)$ entry of $A^2$ is given by $$\sum_{k=1}^n A_{i,k}A_{k,j}=\sum_{k=1}^n a_ka_j=\left(\sum_{k=1}^n a_k\right)a_j=\mbox{tr}(A)a_j=\mbox{tr}(A)A_{i,j},$$

so indeed $A^2=\mbox{tr}(A)A$, and $C=I$ as desired.

NB: To justify that the matrix $B=A+I$ is invertible if and only if $\mbox{tr}(A)+1\not = 0$, note that the above computation actually establishes the identity $$(\mbox{tr}(A)+1)I=(A+I)((\mbox{tr}(A)+1)I-A)$$
from which the equivalence follows.
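The key identity $A^2=\mbox{tr}(A)\,A$ can also be checked numerically. A minimal sketch with a hypothetical row $(a_1,a_2,a_3)$:

```python
# Check the key identity A^2 = tr(A) * A for one hypothetical row.

a = [3.0, 1.0, -2.0]                 # arbitrary choice
n = len(a)
A = [a[:] for _ in range(n)]         # all rows equal
t = sum(a)                           # tr(A): the diagonal of A is (a1, a2, a3)

A2 = [[sum(A[i][k] * A[k][j] for k in range(n)) for j in range(n)]
      for i in range(n)]
assert all(abs(A2[i][j] - t * A[i][j]) < 1e-12
           for i in range(n) for j in range(n))
```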


The inverse of $B$: $$B^{-1}=\frac{\text{adj}{(B)}}{\det(B)}.$$

The determinant of $B$ (subtracting the first row from the second and third): $$\det(B)=\begin{vmatrix}a_{1}+1 & a_{2} & a_{3} \\ a_{1} & a_{2}+1 & a_{3} \\ a_{1} & a_{2} & a_{3}+1\end{vmatrix}= \begin{vmatrix}a_{1}+1 & a_{2} & a_{3} \\ -1 & 1 & 0 \\ -1 & 0 & 1\end{vmatrix}=a_1+a_2+a_3+1.$$

The adjugate of $B$, where $C^T$ denotes the transpose of the cofactor matrix of $B$: $$\text{adj}(B)=C^T=\\ \begin{pmatrix} (a_2+1)(a_3+1)-a_2a_3 & -a_2(a_3+1)+a_2a_3 & a_2a_3-a_3(a_2+1) \\ -a_1(a_3+1)+a_1a_3 & (a_1+1)(a_3+1)-a_1a_3 & -a_3(a_1+1)+a_1a_3 \\ a_1a_2-a_1(a_2+1) & -a_2(a_1+1)+a_1a_2 & (a_1+1)(a_2+1)-a_1a_2 \end{pmatrix}=\\ \begin{pmatrix} a_1+a_2+a_3+1-a_1 & -a_2 & -a_3 \\ -a_1 & a_1+a_2+a_3+1-a_2 & -a_3 \\ -a_1 & -a_2 & a_1+a_2+a_3+1-a_3 \end{pmatrix}=\\ \begin{pmatrix} a_1+a_2+a_3+1 & 0 & 0 \\ 0 & a_1+a_2+a_3+1 & 0 \\ 0 & 0 & a_1+a_2+a_3+1 \end{pmatrix}- \begin{pmatrix} a_1 & a_2 & a_3 \\ a_1 & a_2 & a_3 \\ a_1 & a_2 & a_3 \end{pmatrix}. $$

Hence $$B^{-1}=\frac{\text{adj}(B)}{\det(B)}=\frac{(a_1+a_2+a_3+1)I-A}{a_1+a_2+a_3+1}=I-\frac{A}{\mbox{tr}(A)+1},$$ and the result follows.
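Both intermediate claims, $\det(B)=\mbox{tr}(A)+1$ and $\text{adj}(B)=(\mbox{tr}(A)+1)I-A$, can be checked directly via $2\times 2$ cofactors. A sketch for a hypothetical choice of $(a_1,a_2,a_3)$:

```python
# Check det(B) = tr(A)+1 and adj(B) = (tr(A)+1) I - A via 2x2 cofactors.

def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def minor(M, i, j):
    """The 2x2 matrix left after deleting row i and column j of a 3x3 M."""
    return [[M[r][c] for c in range(3) if c != j] for r in range(3) if r != i]

a = [1.0, 2.0, 3.0]                  # arbitrary choice
A = [a[:] for _ in range(3)]
B = [[A[i][j] + (1.0 if i == j else 0.0) for j in range(3)] for i in range(3)]

# Laplace expansion of det(B) along the first row.
detB = sum((-1) ** j * B[0][j] * det2(minor(B, 0, j)) for j in range(3))

# adj(B)[i][j] is the (j, i) cofactor: the transpose of the cofactor matrix.
adjB = [[(-1) ** (i + j) * det2(minor(B, j, i)) for j in range(3)]
        for i in range(3)]

t = sum(a)                           # tr(A)
assert abs(detB - (t + 1.0)) < 1e-12
expected = [[(t + 1.0) * (1.0 if i == j else 0.0) - A[i][j] for j in range(3)]
            for i in range(3)]
assert all(abs(adjB[i][j] - expected[i][j]) < 1e-12
           for i in range(3) for j in range(3))
```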