Inverse of matrices that differ only by one element


Suppose a real matrix $\textbf{A}_{n\times n}$ is nonsingular with inverse $\textbf{A}^{-1}_{n\times n}$. Now change its $(i,j)$ element from $A_{ij}$ to $A_{ij}+a$ (where $a\neq 0$), keeping all other elements unchanged, to obtain a new matrix $\textbf{B}_{n\times n}$. I want to express $\textbf{B}^{-1}$ in terms of $\textbf{A}^{-1}$. What I have done:

If $$ \textbf{A}= \begin{bmatrix} A_{11} & A_{12} & \dots & A_{1n} \\ \dots & \dots & \dots & \dots \\ \dots & \dots & A_{ij} & \dots \\ \dots & \dots & \dots & \dots \\ A_{n1} & A_{n2} & \dots & A_{nn} \end{bmatrix} $$ then $$ \textbf{B}= \begin{bmatrix} A_{11} & A_{12} & \dots & A_{1n} \\ \dots & \dots & \dots & \dots \\ \dots & \dots & A_{ij}+a & \dots \\ \dots & \dots & \dots & \dots \\ A_{n1} & A_{n2} & \dots & A_{nn} \end{bmatrix} = \begin{bmatrix} A_{11} & A_{12} & \dots & A_{1n} \\ \dots & \dots & \dots & \dots \\ \dots & \dots & A_{ij} & \dots \\ \dots & \dots & \dots & \dots \\ A_{n1} & A_{n2} & \dots & A_{nn} \end{bmatrix} + \begin{bmatrix} 0 & 0 & \dots & 0 \\ \dots & \dots & \dots & \dots \\ \dots & \dots & a & \dots \\ \dots & \dots & \dots & \dots \\ 0 & 0 & \dots & 0 \end{bmatrix} \\ =\textbf{A}+a\textbf{e}_i\textbf{e}_j^T $$

where $\textbf{e}_i,\ \textbf{e}_j \in \mathbb{R}^{n}$ are the standard basis column vectors, with a $1$ in the $i$th and $j$th positions respectively and zeros elsewhere.

Then: $$\textbf{B}_{n\times n}^{-1}=[\textbf{A} +a\textbf{e}_i\textbf{e}_j^T]^{-1}=\textbf{A}^{-1}+a[\textbf{e}_i\textbf{e}_j^T]^{-1}$$

Is this correct? Also, is there something better I can get?


2 Answers

Accepted Answer

A correct formula can be given for the inverse of a matrix when one entry is changed, and more generally when the change to a matrix is a rank-one update. The general result is known as the Sherman-Morrison formula.


We can write the updated matrix $B$ in the form $A + ae_i e_j^T$, where $e_i$ is the column vector of length $n$ with $1$ in the $i$th position and zeros elsewhere, and correspondingly:

$$ e_j^T = \begin{pmatrix} 0 & \ldots & 0 & 1 & 0 & \ldots & 0 \end{pmatrix} $$

is the row vector of length $n$ with $1$ in the $j$th position and zeros elsewhere.

The Question asks essentially if $B^{-1} = A^{-1} + a[e_ie_j^T]^{-1}$. But the latter term makes no sense if $n\gt 1$, since the rank-one matrix $e_i e_j^T$ is then never invertible (having rank $1 < n$).

It can even be checked that the proposed formula is wrong for $n=1$, just by multiplying out $(A^{-1} + a[e_ie_j^T]^{-1})(A + ae_i e_j^T)$, but we leave this as an exercise for the interested reader.
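For the impatient, the $n=1$ check takes only a few lines (a sketch, not part of the original answer): everything is a scalar, $e_1 e_1^T = 1$, and the product above fails to be $1$.

```python
# n = 1 case: A and a are just nonzero scalars, and e_1 * e_1^T = 1.
# The Question's formula claims B^{-1} = A^{-1} + a*(e_1 e_1^T)^{-1} = 1/A + a.
A, a = 2.0, 3.0
B = A + a                 # B = A + a * e_1 * e_1^T = 5.0
proposed = 1.0 / A + a    # the Question's formula: 3.5
correct = 1.0 / B         # the true inverse

# proposed * B = 3.5 * 5.0 = 17.5, nowhere near 1 -> the formula fails.
assert abs(proposed * B - 1.0) > 1e-9
assert abs(correct * B - 1.0) < 1e-12
```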


The Sherman-Morrison formula specializes to the case of changing a single entry $A_{ij}$ to $A_{ij} + a$ as follows:

$$ (A + ae_i e_j^T)^{-1} = A^{-1} - a\frac{A^{-1}e_i e_j^T A^{-1}}{1 + a e_j^T A^{-1}e_i} $$

subject to the condition that we are not "dividing by zero", i.e. that the scalar:

$$ 1 + ae_j^T A^{-1}e_i \neq 0 $$

Note that this denominator is simply some real number because $e_j^T$ is a row and $e_i$ is a column, so that the "size" of the latter matrix product is $1\times 1$. Indeed this matrix product picks out the $j$th row, $i$th column entry of $A^{-1}$:

$$ e_j^T A^{-1}e_i = [A^{-1}]_{ji} $$
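As a numerical sanity check (a sketch, not part of the original answer), the update formula can be verified on a small example. The `inverse` helper is a plain Gauss-Jordan routine, included only so the snippet needs no external libraries:

```python
def inverse(M):
    """Invert a square matrix by Gauss-Jordan elimination with partial pivoting."""
    n = len(M)
    # Augment M with the identity matrix.
    aug = [row[:] + [1.0 if r == c else 0.0 for c in range(n)]
           for r, row in enumerate(M)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(aug[r][col]))
        if abs(aug[pivot][col]) < 1e-12:
            raise ValueError("matrix is singular")
        aug[col], aug[pivot] = aug[pivot], aug[col]
        p = aug[col][col]
        aug[col] = [x / p for x in aug[col]]
        for r in range(n):
            if r != col:
                f = aug[r][col]
                aug[r] = [x - f * y for x, y in zip(aug[r], aug[col])]
    return [row[n:] for row in aug]

def sherman_morrison_entry(Ainv, i, j, a):
    """Inverse of A + a*e_i*e_j^T, given Ainv = A^{-1}."""
    denom = 1.0 + a * Ainv[j][i]        # 1 + a * [A^{-1}]_{ji}
    if abs(denom) < 1e-12:
        raise ValueError("update makes the matrix singular")
    n = len(Ainv)
    # B^{-1} = A^{-1} - a * (col i of A^{-1}) (row j of A^{-1}) / denom
    return [[Ainv[r][c] - a * Ainv[r][i] * Ainv[j][c] / denom
             for c in range(n)] for r in range(n)]

A = [[4.0, 1.0, 0.0],
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]
i, j, a = 0, 2, 5.0                      # change A[0][2] from 0 to 5
B = [row[:] for row in A]
B[i][j] += a

Binv_direct = inverse(B)
Binv_update = sherman_morrison_entry(inverse(A), i, j, a)
err = max(abs(Binv_direct[r][c] - Binv_update[r][c])
          for r in range(3) for c in range(3))
assert err < 1e-9  # the rank-one update agrees with direct inversion
```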

Second Answer

No, it is not correct. In fact, the matrix $B$ can even be singular.

Take for example the identity matrix, which is certainly nonsingular. If you subtract $1$ from any entry on the main diagonal (i.e. take $a = -1$ at a diagonal position), you get a singular matrix.
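This example also shows the Sherman-Morrison condition at work (a sketch, not part of the original answer): with $A = I$, $i = j$, and $a = -1$, the denominator $1 + a\,[A^{-1}]_{ii}$ is exactly zero, which is precisely the formula's signal that $B$ is singular.

```python
n, i = 3, 1
a = -1.0
I = [[1.0 if r == c else 0.0 for c in range(n)] for r in range(n)]

# For A = I we have A^{-1} = I, so the Sherman-Morrison denominator is
denom = 1.0 + a * I[i][i]   # 1 + (-1)*1 = 0.0 -> B is singular

# Indeed B = I + a*e_i*e_i^T has an all-zero row:
B = [row[:] for row in I]
B[i][i] += a
assert denom == 0.0
assert B[i] == [0.0, 0.0, 0.0]
```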