Invertibility of bisymmetric matrices

I have a bisymmetric matrix, i.e. symmetric with respect to both diagonals, and I want to show it's nonsingular.

Wikipedia suggests that all eigenvalues of a real bisymmetric matrix have the same absolute value, so finding one nonzero eigenvalue would suffice.

Is there a simpler criterion for determining the invertibility of bisymmetric matrices?

Bisymmetric matrices have a constrained structure, but not one so constrained that nonsingularity can be guaranteed by looking at a single eigenvalue (unless you have a $1 \times 1$ matrix).

Nevertheless, there is enough structure present that one can meaningfully reduce the work required for eigensystem analysis relative to an unstructured matrix. To see why, consider what one Fields medalist said about bisymmetric linear systems at Math Overflow a few years ago: as Thurston noted, one can effectively split the original eigenproblem into two half-sized eigenproblems. So one can look for singularity in two half-sized (off-by-one if odd-dimensioned) symmetric matrices rather than in the full-sized matrix.

To make this observation concrete, let’s do some light analysis and work out an example. The matrix $R$ that Thurston refers to is often called the exchange matrix, and it can be block diagonalized by the orthogonal matrix \begin{equation} \mathbf{X} = \frac{1}{\sqrt 2}\left( \begin{array}{cc} I & \tilde{R} \\ -\tilde{R} & I \end{array} \right), \end{equation}

where $\tilde{R}$ is the exchange matrix of half the size of $R$ (insert a lone $\sqrt 2$ in the center if the dimension is odd). That is,

\begin{equation} \mathbf{X^T R X} = \left( \begin{array}{cc} -I & 0 \\ 0 & I \end{array} \right). \end{equation}
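As a sanity check, here is a short numerical sketch of this construction (the helper names `exchange` and `block_diagonalizer` are mine, not standard):

```python
import numpy as np

def exchange(n):
    """n-by-n exchange matrix: ones on the anti-diagonal, zeros elsewhere."""
    return np.fliplr(np.eye(n))

def block_diagonalizer(n):
    """Orthogonal X with X.T @ R @ X = diag(-I, I), built exactly as in the
    formula above (a lone sqrt(2) in the center when n is odd)."""
    m = n // 2
    Rt = exchange(m)
    X = np.zeros((n, n))
    X[:m, :m] = np.eye(m)            # top-left block:   I
    X[:m, n - m:] = Rt               # top-right block:  R~
    X[n - m:, :m] = -Rt              # bottom-left:     -R~
    X[n - m:, n - m:] = np.eye(m)    # bottom-right:     I
    if n % 2 == 1:
        X[m, m] = np.sqrt(2)         # center entry for odd dimension
    return X / np.sqrt(2)

n = 5
R, X = exchange(n), block_diagonalizer(n)
print(np.round(X.T @ R @ X))         # diag(-1, -1, 1, 1, 1)
```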

Now take $X$ at the same size as your bisymmetric matrix (call it $A$). Because $A$ is bisymmetric, $R$ and $A$ commute ($RAR = A$ together with $R^2 = I$ gives $RA = AR$), and it’s then easy to see that $X^T A X$ has the block diagonal structure \begin{equation} \mathbf{ X^T A X } = \left( \begin{array}{cc} \tilde A_{11} & 0 \\ 0 & \tilde A_{22} \end{array} \right), \end{equation}

where $\tilde A_{11}$ and $\tilde A_{22}$ are symmetric blocks whose sizes match the corresponding block decomposition of $R$. One then has two half-sized eigensystems that can be analyzed for singularity (or for the full set of eigenvalues, if desired).
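To see the reduction on a generic instance, the following sketch builds a random bisymmetric matrix by symmetrizing twice, applies the $X$ above, and checks that the off-diagonal blocks vanish and that the determinant factors over the two half-sized blocks (variable names are mine):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 7
m = n // 2
R = np.fliplr(np.eye(n))                 # exchange matrix

# A random bisymmetric matrix: symmetrize, then average with R M R
M = rng.standard_normal((n, n))
A = (M + M.T) / 2
A = (A + R @ A @ R) / 2                  # now A = A.T and R A R = A

# The orthogonal X from the formula above, built inline
Rt = np.fliplr(np.eye(m))
X = np.zeros((n, n))
X[:m, :m], X[:m, n - m:] = np.eye(m), Rt
X[n - m:, :m], X[n - m:, n - m:] = -Rt, np.eye(m)
if n % 2 == 1:
    X[m, m] = np.sqrt(2)
X /= np.sqrt(2)

B = X.T @ A @ X
A11, A22 = B[:m, :m], B[m:, m:]          # half-sized symmetric blocks
print(np.max(np.abs(B[:m, m:])))         # off-diagonal block: ~0
# A is nonsingular exactly when both half-sized blocks are
print(np.isclose(np.linalg.det(A), np.linalg.det(A11) * np.linalg.det(A22)))
```

Since $X$ is orthogonal, $\det A = \det \tilde A_{11} \cdot \det \tilde A_{22}$, which is the singularity test in half-sized pieces.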

Example: let's look at the bisymmetric matrix \begin{equation} \mathbf{A} = \left( \begin{array}{ccccc} 3 & -2 & -1 & 0 & 1\\ -2 & 1 & -3 & 1 & 0 \\ -1 & -3 & 5 & -3 & -1 \\ 0 & 1 & -3 & 1 & -2 \\ 1 & 0 & -1 & -2 & 3 \end{array} \right). \end{equation}

Then \begin{equation} \mathbf{X} = \frac{1}{\sqrt 2}\left( \begin{array}{ccccc} 1 & 0 & 0 & 0 & 1\\ 0 & 1 & 0 & 1 & 0 \\ 0 & 0 & \sqrt 2 & 0 & 0 \\ 0 & -1 & 0 & 1 & 0 \\ -1 & 0 & 0 & 0 & 1 \end{array} \right) \end{equation}

produces the block diagonal matrix \begin{equation} \mathbf{X^T A X} = \left( \begin{array}{ccccc} 2 & -2 & 0 & 0 & 0\\ -2 & 0 & 0 & 0 & 0 \\ 0 & 0 & 5 & -\sqrt{18} & -\sqrt2 \\ 0 & 0 & -\sqrt{18} & 2 & -2 \\ 0 & 0 & -\sqrt 2 & -2 & 4 \end{array} \right). \end{equation}

The eigenvalues of $A$ can now be obtained from the diagonal blocks of $X^T A X$, and are readily computed to be $-2$, $1-\sqrt 5$, $1 + \sqrt 5$, $5$, and $8$. Or, since you're interested in checking for singularity, simply take determinants of the two diagonal blocks in this small example.
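The example is easy to check numerically; this sketch (NumPy assumed) reproduces the blocks and the eigenvalues:

```python
import numpy as np

# The 5x5 bisymmetric example: A = A.T and R A R = A
A = np.array([[ 3, -2, -1,  0,  1],
              [-2,  1, -3,  1,  0],
              [-1, -3,  5, -3, -1],
              [ 0,  1, -3,  1, -2],
              [ 1,  0, -1, -2,  3]], dtype=float)

# The orthogonal X written out above for n = 5
X = np.array([[ 1,  0, 0,          0, 1],
              [ 0,  1, 0,          1, 0],
              [ 0,  0, np.sqrt(2), 0, 0],
              [ 0, -1, 0,          1, 0],
              [-1,  0, 0,          0, 1]]) / np.sqrt(2)

B = np.round(X.T @ A @ X, 10)
print(B[:2, :2])                       # first block: [[2, -2], [-2, 0]]
print(np.linalg.eigvalsh(A))           # -2, 1-sqrt(5), 1+sqrt(5), 5, 8
```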

To clarify the Wikipedia statement about the eigenvalues of $A$ after pre- or post-multiplication by the exchange matrix $R$ (alluded to in your question): it says that $RA$ and $AR$ have the same eigenvalues as $A$ up to sign. In the example above, you’ll find that the matrix $RA$ has eigenvalues $-1-\sqrt 5$, $-2$, $-1+\sqrt 5$, $5$, and $8$. This is the same as the spectrum of $A$ apart from two sign flips.
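The sign-flip claim can be checked the same way (again a NumPy sketch; note that $RA$ is symmetric here, since $(RA)^T = A^T R^T = AR = RA$, so its eigenvalues are real):

```python
import numpy as np

# Same 5x5 bisymmetric example as above
A = np.array([[ 3, -2, -1,  0,  1],
              [-2,  1, -3,  1,  0],
              [-1, -3,  5, -3, -1],
              [ 0,  1, -3,  1, -2],
              [ 1,  0, -1, -2,  3]], dtype=float)
R = np.fliplr(np.eye(5))                # exchange matrix

eig_A  = np.linalg.eigvalsh(A)
eig_RA = np.linalg.eigvalsh(R @ A)      # R A is symmetric since R and A commute
print(np.round(eig_A, 4))
print(np.round(eig_RA, 4))
# the two spectra agree as multisets of absolute values
print(np.allclose(np.sort(np.abs(eig_A)), np.sort(np.abs(eig_RA))))
```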

The fact that $RA$ and $AR$ have the same eigenvalues, up to sign, as a bisymmetric matrix $A$ is a straightforward observation. The more interesting direction is the converse: if $RA$ and $AR$ have the same eigenvalues up to sign as a given real symmetric matrix $A$, then $A$ and $R$ must commute; that is, the eigenvalue condition is necessary and sufficient for a real symmetric matrix $A$ to be bisymmetric.