Perturbations to a matrix causing drastic changes to its inverse.


I'm reading this article about matrix norms because I want to understand the math behind SVD.

One of the interesting issues it raises early on is the effect of perturbations to a matrix on its inverse.

$$A = \begin{bmatrix} 100 & 100 \\ 100.1 & 100 \end{bmatrix}, \qquad A^{-1} = \begin{bmatrix} -10 & 10 \\ 10.01 & -10 \end{bmatrix}$$

And here is the perturbed matrix, which I'll call $\Delta A$, together with its inverse:

$$\Delta A = \begin{bmatrix} 100 & 100 \\ 100.2 & 100 \end{bmatrix}, \qquad (\Delta A)^{-1} = \begin{bmatrix} -5 & 5 \\ 5.01 & -5 \end{bmatrix}$$

A minor change to the original matrix drastically alters the resulting inverse.
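To see the sensitivity concretely, here is a quick NumPy check of the two matrices above (a sketch; `A` and `dA` are just the matrices from the question):

```python
import numpy as np

# Original matrix and its perturbed version: only the (2,1) entry
# changes, from 100.1 to 100.2.
A  = np.array([[100.0, 100.0],
               [100.1, 100.0]])
dA = np.array([[100.0, 100.0],
               [100.2, 100.0]])

# Both determinants are tiny relative to the entries (~100),
# so both matrices are close to singular.
print(np.linalg.det(A), np.linalg.det(dA))   # approx -10 and -20

# A change of 0.1 in one entry halves every entry of the inverse.
print(np.linalg.inv(A))    # entries around +/-10
print(np.linalg.inv(dA))   # entries around +/-5
```

The near-singularity (determinant of order $10$ against entries of order $100$) is exactly what makes the inverse so sensitive.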

I'm interested in a formal approach to describing how the inverse of a matrix is affected by perturbations to the original matrix, i.e. how to describe $\frac{\partial A^{-1}}{\partial A_{ij}}$.

Does this have something to do with Jacobians?
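There is a closed form for this derivative: differentiating $A A^{-1} = I$ with respect to $A_{ij}$ gives $\frac{\partial A^{-1}}{\partial A_{ij}} = -A^{-1} E_{ij} A^{-1}$, where $E_{ij}$ is the matrix with a $1$ in position $(i,j)$ and zeros elsewhere. Collecting these matrices over all $(i,j)$ is precisely the Jacobian of the map $A \mapsto A^{-1}$. A small sketch that checks the identity against a finite difference (the function name `dinv` is mine, not from the article):

```python
import numpy as np

def dinv(A, i, j):
    """Analytic derivative of A^{-1} w.r.t. the entry A[i, j]:
    d(A^{-1})/dA_ij = -A^{-1} E_ij A^{-1}
                    = -(column i of A^{-1}) (row j of A^{-1})^T."""
    Ainv = np.linalg.inv(A)
    return -np.outer(Ainv[:, i], Ainv[j, :])

A = np.array([[100.0, 100.0],
              [100.1, 100.0]])

# Finite-difference check: nudge A[1, 0] by h and compare.
h = 1e-6
Ah = A.copy()
Ah[1, 0] += h
numeric = (np.linalg.inv(Ah) - np.linalg.inv(A)) / h
print(np.allclose(dinv(A, 1, 0), numeric, rtol=1e-3))  # True
```

Note that the derivative scales like $\|A^{-1}\|^2$, which is why a nearly singular matrix (large inverse) reacts so violently to tiny perturbations.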

Edit: taking advice from @Doug M, I plotted $\det(\Delta A)$ as the entry $A_{2,1}$ is perturbed over the interval $(0, 5]$ in steps of $0.1$. As the graph shows, small perturbations leave the determinant close to $0$, so the matrix stays nearly singular and its inverse is large; as the perturbation grows, the determinant moves further from $0$ and the inverse settles down.

[plot of $\det(\Delta A)$ against the size of the perturbation]
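The plotted values can be reproduced with a short script (an assumption on my part: the perturbation $t$ is added to the original entry $A_{2,1} = 100.1$):

```python
import numpy as np

# Perturbation sizes t over (0, 5] in steps of 0.1.
ts = np.linspace(0.1, 5.0, 50)

# det = 100*100 - 100*(100.1 + t) = -10 - 100*t,
# so |det| grows linearly as the perturbation moves A away from singularity.
dets = [np.linalg.det(np.array([[100.0, 100.0],
                                [100.1 + t, 100.0]])) for t in ts]
print(dets[0], dets[-1])  # approx -20 and -510
```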