On the eigenvalues of a block-symmetric matrix


I'm trying to find a property that tells me something about the invertibility of the matrix $$ M = \begin{bmatrix} A & B \\ -B^{T} & A \end{bmatrix}$$

where both $A$ and $B$ are square and invertible (but not symmetric), and the eigenvalues of $A$ are purely imaginary (I don't know whether this is a plus or not).

Is there any "easy" condition to say that $M$ is going to be invertible?

Thanks in advance!


Best answer:

Suppose $A, B \in\mathbb R^{n\times n}$ and let

$$A = \pmatrix{ -&-&\mathbf a_1&-&-\\ -&-&\mathbf a_2&-&-\\ &&\vdots\\ -&-&\mathbf a_n&-&-\\ },$$

where $\mathbf a_i\in \Bbb R^n$, and similarly for $B$ and $\mathbf b_i\in\Bbb R^n$.

We have that $M$ fails to be invertible if and only if there is a nonzero $\mathbf w \in \mathbb R^{2n}$ with $M\mathbf w = 0$. Given $\mathbf w = (u_1,u_2,\dots,u_n, v_1,v_2,\dots,v_n)\in\Bbb R^{2n}$, let $\mathbf w^1 = P_1 \mathbf w = (u_1,u_2,\dots,u_n)$ and similarly let $\mathbf w^2 = P_2 \mathbf w = (v_1,v_2,\dots, v_n)$. Then $M$ fails to be invertible if and only if $P_1 M\mathbf w = P_2 M \mathbf w = 0$ for some nonzero $\mathbf w$.

We have that

\begin{align} P_1 M\mathbf w &= \pmatrix{ \langle \mathbf a_1, \mathbf w^1\rangle + \langle \mathbf b_1, \mathbf w^2 \rangle\\ \langle \mathbf a_2, \mathbf w^1\rangle + \langle \mathbf b_2, \mathbf w^2 \rangle\\ \vdots\\ \langle \mathbf a_n, \mathbf w^1\rangle + \langle \mathbf b_n, \mathbf w^2 \rangle }\\ &= \pmatrix{ \langle \mathbf a_1, \mathbf w^1\rangle\\ \langle \mathbf a_2, \mathbf w^1\rangle\\ \vdots\\ \langle \mathbf a_n, \mathbf w^1\rangle } + \pmatrix{ \langle \mathbf b_1, \mathbf w^2 \rangle\\ \langle \mathbf b_2, \mathbf w^2 \rangle\\ \vdots\\ \langle \mathbf b_n, \mathbf w^2 \rangle } = A \mathbf w^1 + B\mathbf w^2 \end{align}

and similarly, $P_2 M\mathbf w = -B^T\mathbf w^1 + A \mathbf w^2$ (note the minus sign from the lower-left block of $M$). Hence, $M$ fails to be invertible if and only if, for some nonzero $\mathbf w$, we have

$$\left\{\begin{array}{ccc} A \mathbf w^1 &=& - B\mathbf w^2\\ A \mathbf w^2 &=& B^T\mathbf w^1 \end{array}\right.\tag{$*$}$$

Since $A$ is invertible, the second equation yields $\mathbf w^2 = A^{-1}B^T\mathbf w^1$, and substituting back into the first we obtain

$$A \mathbf w^1 = -BA^{-1}B^T\mathbf w^1 \implies \left(A + BA^{-1}B^T\right)\mathbf w^1 = 0.\tag{$**$}$$

Notice that because $A$ is invertible and $\mathbf w$ is nonzero, the system $(*)$ implies that both $\mathbf w^1$ and $\mathbf w^2$ are nonzero: if $\mathbf w^1 = \mathbf 0$, the second equation forces $A\mathbf w^2 = \mathbf 0$, hence $\mathbf w^2 = \mathbf 0$ and $\mathbf w = \mathbf 0$, a contradiction (and symmetrically via the first equation if $\mathbf w^2 = \mathbf 0$). In other words, we may conclude from $(**)$ that $A + BA^{-1}B^T$ is not invertible $($and hence has $\det = 0)$.

On the other hand, if $A + BA^{-1}B^T$ fails to be invertible, we can take a nonzero $\mathbf w^1$ in its kernel, set $\mathbf w^2 = A^{-1}B^T\mathbf w^1$, and run the argument in reverse to recover the system $(*)$. So this is both a necessary and sufficient condition: $M$ is invertible if and only if $A + BA^{-1}B^T$ is.
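As a numerical sanity check of this criterion (a NumPy sketch; the size $n=4$ and the random seed are arbitrary choices, not part of the argument): for generic invertible $A$ and $B$, $\det M$ agrees with $\det A\cdot\det(A+BA^{-1}B^T)$, so the two matrices are singular simultaneously.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))  # generic, hence invertible
B = rng.standard_normal((n, n))

# Assemble M = [[A, B], [-B^T, A]]
M = np.block([[A, B], [-B.T, A]])

# det M should equal det(A) * det(A + B A^{-1} B^T)
lhs = np.linalg.det(M)
rhs = np.linalg.det(A) * np.linalg.det(A + B @ np.linalg.inv(A) @ B.T)
print(np.isclose(lhs, rhs))  # True
```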

Another answer:

I want to look at this more generally and show that for

$$ H=\begin{pmatrix}A &B\\C &D\end{pmatrix} $$

with $A,B,C,D\in\mathbb{F}^{(n,n)}$ and $A$ invertible, we have

$$\det H=\det A\cdot\det (D-CA^{-1}B)$$


For this, we first observe the following:

Proposition: Let $A,C\in\mathbb{F}^{(n,n)}$. For every $U\in\mathbb{F}^{(n,n)}$, there is a sequence of row transformations $r_i\mapsto r_i+\lambda r_j$ with $i\neq j$ (that is, replacing one row by itself plus a scalar multiple of another, distinct row) that transforms $$\begin{pmatrix}A\\C\end{pmatrix}\text{ into }\begin{pmatrix}A\\C+UA\end{pmatrix}$$

Proof: For $U=E_{ij}$, the matrix that is $0$ everywhere except for a $1$ in the $ij$-th position, $E_{ij}A$ is the matrix whose $k$-th row is $\mathbf 0$ for $k\neq i$ and the $j$-th row of $A$ for $k=i$. So for $U=E_{ij}$ we can achieve the desired form by adding the $j$-th row of $A$ to the $i$-th row of $C$. Similarly, for $U=\lambda E_{ij}$, we add $\lambda$ times the $j$-th row of $A$ to the $i$-th row of $C$. Now let $U\in\mathbb{F}^{(n,n)}$ be arbitrary and write $U=\sum_{i,j\leq n}u_{ij}E_{ij}$. Since these row operations never modify the rows of $A$, performing the operation corresponding to each $u_{ij}E_{ij}$ (in any order) transforms $C$ into $C+UA$. $\Box$
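The proposition can also be phrased as a single matrix identity: performing all of these row operations at once amounts to left-multiplying by the unit block-triangular matrix $\begin{pmatrix}I&0\\U&I\end{pmatrix}$, which has determinant $1$. A small NumPy sketch (sizes and seed are arbitrary) illustrates this:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))
C = rng.standard_normal((n, n))
U = rng.standard_normal((n, n))

I = np.eye(n)
Z = np.zeros((n, n))

# All row operations r_i -> r_i + u_ij * r_j together are one
# left-multiplication by the unit lower block-triangular matrix [[I, 0], [U, I]].
L = np.block([[I, Z], [U, I]])
stacked = np.vstack([A, C])

result = L @ stacked
expected = np.vstack([A, C + U @ A])
print(np.allclose(result, expected))  # True
```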


You may check yourself that this type of row transformation applied to a square matrix leaves the determinant invariant. This property is what we are going to use in the following:

Proposition: Let $A,B,C,D\in\mathbb{F}^{(n,n)}$ and $$ H=\begin{pmatrix}A &B\\C &D\end{pmatrix} $$ If $A$ is invertible, then $\det H=\det A\cdot\det (D-CA^{-1}B)$.

Proof: Let $U=-CA^{-1}$. Applying the same row transformations as in the above proposition to $H$ (thus the $\begin{pmatrix}B\\D\end{pmatrix}$ side gets transformed simultaneously) yields the following, as $H$ is in block form: $$H'=\begin{pmatrix}A &B\\C-CA^{-1}A &D-CA^{-1}B\end{pmatrix}=\begin{pmatrix}A &B\\\mathbf0 &D-CA^{-1}B\end{pmatrix}$$ As remarked before, these row transformations leave the determinant invariant, i.e. $\det H=\det H'$. Now, I use without proof that the determinant of a block upper triangular matrix is the product of the determinants of its diagonal blocks, so it follows that $\det H=\det H'=\det A\cdot\det(D-CA^{-1}B)$. $\Box$

Check this last property, concerning the determinant of a block upper triangular matrix, as well if you are unsure about it.
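The determinant formula itself is easy to sanity-check numerically (a NumPy sketch; block size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
A = rng.standard_normal((n, n))  # generic, hence invertible
B = rng.standard_normal((n, n))
C = rng.standard_normal((n, n))
D = rng.standard_normal((n, n))

H = np.block([[A, B], [C, D]])
schur = D - C @ np.linalg.inv(A) @ B  # Schur complement of A in H

# det H should equal det(A) * det(D - C A^{-1} B)
lhs = np.linalg.det(H)
rhs = np.linalg.det(A) * np.linalg.det(schur)
print(np.isclose(lhs, rhs))  # True
```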


Let's apply this to your problem. Your $M$ is square, as are your $A$ and $B$, and your $A$ is invertible. By the above proposition, we have that $\det M=\det A\cdot\det(A-(-B^\top)A^{-1}B)=\det A\cdot\det(A+B^\top A^{-1}B)$.

So we conclude that $\det M\neq 0$ if and only if $\det(A+B^\top A^{-1}B) \neq 0$, since $\det A\neq 0$ by assumption.
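This final criterion can be checked numerically as well (a NumPy sketch; $n=4$ and the seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
A = rng.standard_normal((n, n))  # generic, hence invertible
B = rng.standard_normal((n, n))

M = np.block([[A, B], [-B.T, A]])

# Invertibility of M reduces to invertibility of A + B^T A^{-1} B:
# det M = det(A) * det(A + B^T A^{-1} B)
detM = np.linalg.det(M)
crit = np.linalg.det(A) * np.linalg.det(A + B.T @ np.linalg.inv(A) @ B)
print(np.isclose(detM, crit))  # True
```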