Recognizing Patterns in Alternating-Sign Matrices and Their Inverses


Let's say we have the upper-triangular matrix $A$ with alternating-sign $1$'s below:

$$A = \begin{bmatrix}1&-1&1&-1\\0&1&-1&1\\0&0&1&-1\\0&0&0&1\end{bmatrix}$$

If we find the inverse, we get

$$A^{-1} = \begin{bmatrix}1&1&0&0\\0&1&1&0\\0&0&1&1\\0&0&0&1\end{bmatrix}$$

We get a similar pattern for $5 \times 5$, $6 \times 6$, ..., $n \times n$ matrices. How would we prove that this pattern holds for the inverse of the $n \times n$ matrix of this form?
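Before proving it, the conjectured pattern can be checked numerically for several sizes. A minimal sketch using NumPy (the helper names `alt_sign_matrix` and `bidiagonal` are my own, not from the question):

```python
import numpy as np

def alt_sign_matrix(n):
    """Upper-triangular matrix with entries (-1)^(j-i) for j >= i."""
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(i, n):
            A[i, j] = (-1.0) ** (j - i)
    return A

def bidiagonal(n):
    """Identity plus ones on the superdiagonal: the conjectured inverse."""
    return np.eye(n) + np.diag(np.ones(n - 1), k=1)

# The pattern holds for every size tried.
for n in range(2, 9):
    assert np.allclose(np.linalg.inv(alt_sign_matrix(n)), bidiagonal(n))
```

This is only evidence, of course; the answers below give the proof.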


There are 3 answers below.


See https://en.wikipedia.org/wiki/Matrix_function and the reference Higham (2008)

Also https://en.wikipedia.org/wiki/Jordan_normal_form#Matrix_functions

The matrix you call $A^{-1}$ is already in Jordan normal form. It can be written as $I + N,$ where $I$ is the identity and $N$ is the matrix with $1$'s on the superdiagonal. The important property of $N$ is that it is nilpotent; in particular, $N^n = 0.$

So, we have the series $\frac{1}{1 + x} = 1 - x + x^2 - x^3 + x^4 - x^5 + \cdots,$ which converges when $-1 < x < 1.$

In comparison, the fact that $N$ is nilpotent, and $IN=NI,$ means $$ (I+N)^{-1} = I - N + N^2 - N^3 + N^4 - + \cdots + (-1)^{n-1} N^{n-1} $$ and there is no need to worry about convergence.
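The finite series can be checked directly. A sketch in NumPy, taking $n = 4$ so that $N$ is the $4 \times 4$ superdiagonal shift:

```python
import numpy as np

n = 4
N = np.diag(np.ones(n - 1), k=1)   # superdiagonal shift; nilpotent with N^4 = 0
assert np.allclose(np.linalg.matrix_power(N, n), np.zeros((n, n)))

# Finite Neumann series: (I + N)^{-1} = I - N + N^2 - N^3
series = sum((-1) ** k * np.linalg.matrix_power(N, k) for k in range(n))
assert np.allclose(series @ (np.eye(n) + N), np.eye(n))
```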

This is a basic fact of life. Given a real analytic function, we can evaluate that function on a square matrix if we know how to find the Jordan normal form of the matrix. There are whole books on this.


To elaborate on the other answer, which is correct, though presented with no concrete examples, note that your matrix $A$ is a sum $A = I + N$, where $N$ is the nilpotent matrix $$N = \begin{bmatrix} 0 & 1 & & \\ & 0& 1 & \\ & & 0& 1 \\ & & &0 \end{bmatrix}$$ The matrix $N$ has powers of a simple form, for example $$N^2 = \begin{bmatrix} 0 & 0 &1 & \\ & 0& 0 &1 \\ & & 0& 0 \\ & & &0 \end{bmatrix}, \quad N^3 = \begin{bmatrix} 0 & 0 &0 & 1\\ & 0& 0 &0 \\ & & 0& 0 \\ & & &0 \end{bmatrix}, \quad N^4 = 0$$

It is a basic fact in commutative algebra that when you have a nilpotent element $N$, then $1 + N$ is invertible, since (I'll suppose here that $N^4 = 0$) we have $$I = I^4 - N^4 = (I + N)(I - N + N^2 - N^3)$$ and so $(I - N + N^2 - N^3)$ must be the inverse of $(I + N)$.
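The telescoping product above can be verified for the concrete $4 \times 4$ matrix $N$ from this answer (a quick NumPy sketch):

```python
import numpy as np

N = np.diag([1.0, 1.0, 1.0], k=1)   # the 4x4 nilpotent N from the answer
I = np.eye(4)

# The product (I + N)(I - N + N^2 - N^3) telescopes to I - N^4 = I.
lhs = (I + N) @ (I - N + N @ N - N @ N @ N)
assert np.allclose(lhs, I)
```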


Hmm, you can also prove it from a basic understanding of inverses of triangular matrices, together with a proof by induction.

$$ \left( \begin{array}{c | c} A_{TL} & A_{TR} \\ \hline 0 & A_{BR} \end{array} \right)^{-1} = \left( \begin{array}{c | c} A_{TL}^{-1} & -A_{TL}^{-1} A_{TR} A_{BR}^{-1}\\ \hline 0 & A_{BR}^{-1} \end{array} \right) $$
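This block formula is easy to sanity-check numerically. A sketch using NumPy with random (seeded, almost surely invertible) diagonal blocks:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random blocks; adding 3*I makes the diagonal blocks comfortably invertible.
Atl = rng.standard_normal((3, 3)) + 3 * np.eye(3)
Abr = rng.standard_normal((2, 2)) + 3 * np.eye(2)
Atr = rng.standard_normal((3, 2))

A = np.block([[Atl, Atr], [np.zeros((2, 3)), Abr]])
Atl_inv, Abr_inv = np.linalg.inv(Atl), np.linalg.inv(Abr)

# Block formula for the inverse of a block upper-triangular matrix
A_inv = np.block([[Atl_inv, -Atl_inv @ Atr @ Abr_inv],
                  [np.zeros((2, 3)), Abr_inv]])
assert np.allclose(A_inv, np.linalg.inv(A))
```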

Now, let $ A_{n \times n} $ denote a matrix of the given form, of size $ n \times n $, and let $ x_n = \left( \begin{array}{c} 1 \\ -1 \\ 1 \\ \vdots \end{array} \right) $ be the alternating-sign vector of length $ n $.

Notice that $$ A_{n \times n} = \left( \begin{array}{c | c} A_{k \times k} & (-1)^k x_k x_{(n-k)}^T \\ \hline 0 & A_{(n-k) \times (n-k)} \end{array} \right) $$ where $$ x_k x_{n-k}^T = \left( \begin{array}{c} 1 \\ -1 \\ 1 \\ \vdots \end{array} \right) \left( \begin{array}{c} 1 \\ -1 \\ 1 \\ \vdots \end{array} \right)^T = \left( \begin{array}{c} 1 \\ -1 \\ 1 \\ \vdots \end{array} \right) \left( \begin{array}{c c c c} 1 & -1 & 1 & \cdots \end{array} \right) = \left( \begin{array}{c c c c} 1 & -1 & 1 & \cdots \\ -1 & 1 & -1 & \cdots \\ 1 & -1 & 1 & \cdots \\ \vdots & \vdots & \vdots & \end{array} \right) $$

So...

$$ A_{n \times n}^{-1} = \left( \begin{array}{c | c} A_{k \times k}^{-1} & - (-1)^k A_{k \times k}^{-1} x_k x_{(n-k)}^T A_{(n-k) \times (n-k)}^{-1} \\ \hline 0 & A_{(n-k) \times (n-k)}^{-1} \end{array} \right) $$

Now, $$ A_{k \times k}^{-1} x_k = -(-1)^{k} e_L $$ where $ e_L $ is a vector of all zeroes except the last entry, which equals one.

Similarly, $$ x_{(n-k)}^T A_{(n-k) \times (n-k)}^{-1} = e_F^T $$ where $ e_F $ is a vector of all zeroes except the first entry, which equals one.
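The two identities above can be confirmed numerically for a concrete split, say $n = 6$, $k = 3$. A sketch in NumPy (the helper `A_mat` is my own name for the matrix of the question):

```python
import numpy as np

def A_mat(n):
    """Alternating-sign upper-triangular matrix from the question."""
    return np.array([[(-1.0) ** (j - i) if j >= i else 0.0
                      for j in range(n)] for i in range(n)])

n, k = 6, 3
x = lambda m: np.array([(-1.0) ** i for i in range(m)])   # (1, -1, 1, ...)

# A_{k x k}^{-1} x_k = -(-1)^k e_L  (all zeros except a final +/-1)
Ak_inv = np.linalg.inv(A_mat(k))
e_last = np.zeros(k); e_last[-1] = 1.0
assert np.allclose(Ak_inv @ x(k), -(-1.0) ** k * e_last)

# x_{n-k}^T A_{(n-k) x (n-k)}^{-1} = e_F^T  (all zeros except a leading 1)
Abr_inv = np.linalg.inv(A_mat(n - k))
e_first = np.zeros(n - k); e_first[0] = 1.0
assert np.allclose(x(n - k) @ Abr_inv, e_first)
```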

Putting the pieces together, the top-right block of the inverse becomes $ -(-1)^k \left( -(-1)^k e_L \right) e_F^T = e_L e_F^T $, a single $1$ in position $(k, k+1)$ — exactly the superdiagonal entry we expect. Now, I am notorious for losing negative signs and minor typos, but you get the idea.