Using a block matrix to show that $A$ and $B$ commute


I'm studying for a PhD qualifying exam in linear algebra and I wanted to ask about the following problem:


Let $A$ and $B$ be invertible $n\times n$ matrices. Let $M$ be the matrix $$M = \begin{bmatrix} A & B \\ B^{-1} & A^{-1} \end{bmatrix}.$$ Assume that $M$ has rank $n$. Prove that $A$ and $B$ commute.


Here's what I got so far. We know that

$$0 = \det (M) = \det(A) \det \left( A^{-1} - B^{-1} A^{-1} B \right)$$

since $M$ is rank deficient. Since $A$ is invertible, $\det(A)\neq 0 $, so $\det(A^{-1}-B^{-1}A^{-1}B) = 0$.

I would like to somehow conclude that $A^{-1}-B^{-1}A^{-1}B = 0$, which would give the desired result. But I only have that this matrix is singular, not necessarily $0$. Does anyone have any suggestions for what I can try?
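As a sanity check on the setup (not part of a proof), here is a small experiment in plain Python with exact rational arithmetic: a commuting invertible pair $A, B$ for which $M$ really does have rank $n$. The matrices are toy examples of my own choosing, with inverses written out by hand.

```python
from fractions import Fraction as F

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def rank(M):
    # exact Gaussian elimination over the rationals (no float issues)
    M = [[F(x) for x in row] for row in M]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        M[r] = [x / M[r][c] for x in M[r]]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                M[i] = [a - M[i][c] * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def block2(P, Q, R, S):
    # assemble [[P, Q], [R, S]] as one 2n x 2n matrix
    return [P[i] + Q[i] for i in range(len(P))] + \
           [R[i] + S[i] for i in range(len(R))]

# a commuting invertible pair (both unipotent upper triangular), n = 2
A, Ainv = [[1, 2], [0, 1]], [[1, -2], [0, 1]]
B, Binv = [[1, 3], [0, 1]], [[1, -3], [0, 1]]
I2 = [[1, 0], [0, 1]]

assert matmul(A, Ainv) == I2 and matmul(B, Binv) == I2
assert matmul(A, B) == matmul(B, A)   # A and B commute

M = block2(A, B, Binv, Ainv)
print(rank(M))   # prints 2, i.e. rank n, matching the hypothesis
```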

2 Answers

Best answer

Hint. Multiply by $\begin{pmatrix} B & 0\\ 0 & A\end{pmatrix}$.

Solution (spoilerised, in case you would like to try the hint first).

Since $A$ and $B$ are invertible, so is $\begin{pmatrix} B & 0\\ 0 & A\end{pmatrix}$. Multiplying by an invertible matrix doesn't change the rank. Hence $$\begin{pmatrix} A & B\\ B^{-1} & A^{-1}\end{pmatrix}\begin{pmatrix} B & 0\\ 0 & A\end{pmatrix}=\begin{pmatrix} AB & BA\\ \boldsymbol{1}_n & \boldsymbol{1}_n\end{pmatrix}$$ has rank $n$ (where $\boldsymbol{1}_n$ denotes the $n\times n$ identity matrix). Subtracting the $i$-th column from the $(n+i)$-th column for all $i=1,\dotsc,n$ (which doesn't change the rank), we see that $$\begin{pmatrix} AB & BA-AB\\ \boldsymbol{1}_n & 0\end{pmatrix}$$ also has rank $n$. But $\begin{pmatrix}AB\\\boldsymbol{1}_n\end{pmatrix}$ already has rank $n$. This forces $BA-AB=0$: otherwise there would be another non-zero column, and the rank would be at least $n+1$.
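The first step of the hint (and the rank invariance it relies on) can be checked numerically. A minimal sketch in plain Python with exact rational arithmetic; the matrices $A$ and $B$ below are my own small examples, deliberately chosen *not* to commute, since the identity $M\cdot\operatorname{diag}(B,A) = \begin{pmatrix}AB & BA\\ \boldsymbol{1}_n & \boldsymbol{1}_n\end{pmatrix}$ holds for any invertible pair.

```python
from fractions import Fraction as F

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def rank(M):
    # exact Gaussian elimination over the rationals
    M = [[F(x) for x in row] for row in M]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        M[r] = [x / M[r][c] for x in M[r]]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                M[i] = [a - M[i][c] * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def block2(P, Q, R, S):
    # assemble [[P, Q], [R, S]] as one 2n x 2n matrix
    return [P[i] + Q[i] for i in range(len(P))] + \
           [R[i] + S[i] for i in range(len(R))]

# a deliberately NON-commuting invertible pair (n = 2)
A, Ainv = [[1, 1], [0, 1]], [[1, -1], [0, 1]]
B, Binv = [[1, 0], [1, 1]], [[1, 0], [-1, 1]]
I2, Z2 = [[1, 0], [0, 1]], [[0, 0], [0, 0]]

M = block2(A, B, Binv, Ainv)
D = block2(B, Z2, Z2, A)               # diag(B, A), invertible

# the product in the hint: M * diag(B, A) = [[AB, BA], [1, 1]]
assert matmul(M, D) == block2(matmul(A, B), matmul(B, A), I2, I2)
# multiplying by the invertible diag(B, A) preserves the rank
assert rank(matmul(M, D)) == rank(M)
print("hint verified")
```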

Another answer

This follows directly from the decomposition $$ \pmatrix{I&-BA\\ 0&A}\pmatrix{A&B\\ B^{-1}&A^{-1}}\pmatrix{B&0\\ -A&I}=\pmatrix{AB-BA&0\\ 0&I}. $$ Since the two outer factors are invertible, the right-hand side has the same rank as $M$, namely $n$; the identity block alone already contributes rank $n$, so $AB-BA=0$, i.e. $A$ and $B$ commute.

Where does it come from: since the matrix has rank $n$ and we are asked to show that $AB-BA=0$, we expect that by block Gaussian elimination we can transform the given matrix into a direct sum $X\oplus Y$, where $X$ is somehow equivalent to $AB-BA$ and $Y$ is invertible. Indeed, by Gaussian elimination we can kill the two off-diagonal sub-blocks one by one: \begin{align} \pmatrix{I&-BA\\ 0&I}\pmatrix{A&B\\ B^{-1}&A^{-1}}=\pmatrix{A-BAB^{-1}&0\\ B^{-1}&A^{-1}},\tag{1}\\ \pmatrix{A-BAB^{-1}&0\\ B^{-1}&A^{-1}}\pmatrix{I&0\\ -AB^{-1}&I}=\pmatrix{A-BAB^{-1}&0\\ 0&A^{-1}}.\tag{2} \end{align}

At this point we are essentially done, as $A-BAB^{-1}=(AB-BA)B^{-1}$. To make the matrix decompositions more eye-pleasing, we further modify the matrix on the RHS of $(2)$: $$ \pmatrix{I&0\\ 0&A}\pmatrix{A-BAB^{-1}&0\\ 0&A^{-1}}\pmatrix{B&0\\ 0&I} =\pmatrix{AB-BA&0\\ 0&I}.\tag{3} $$ Putting $(1)$–$(3)$ together, we obtain the matrix decomposition at the beginning of this answer.
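The decomposition above can also be verified mechanically. A small sketch in plain Python, again with toy matrices of my own choosing (a non-commuting pair, so the $AB-BA$ block on the right-hand side is visibly non-zero):

```python
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def matsub(X, Y):
    return [[a - b for a, b in zip(rx, ry)] for rx, ry in zip(X, Y)]

def matneg(X):
    return [[-a for a in row] for row in X]

def block2(P, Q, R, S):
    # assemble [[P, Q], [R, S]] as one 2n x 2n matrix
    return [P[i] + Q[i] for i in range(len(P))] + \
           [R[i] + S[i] for i in range(len(R))]

# a non-commuting invertible pair (n = 2), inverses written by hand
A, Ainv = [[1, 1], [0, 1]], [[1, -1], [0, 1]]
B, Binv = [[1, 0], [1, 1]], [[1, 0], [-1, 1]]
I2, Z2 = [[1, 0], [0, 1]], [[0, 0], [0, 0]]

M = block2(A, B, Binv, Ainv)
L = block2(I2, matneg(matmul(B, A)), Z2, A)   # [[I, -BA], [0, A]]
R = block2(B, Z2, matneg(A), I2)              # [[B, 0], [-A, I]]

lhs = matmul(matmul(L, M), R)
rhs = block2(matsub(matmul(A, B), matmul(B, A)), Z2, Z2, I2)
assert lhs == rhs   # L * M * R = [[AB-BA, 0], [0, I]]
print("decomposition verified")
```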