Matrix identity over a general field


This is a problem from Hoffman and Kunze: let $A$, $B$, $C$, $D$ be $n \times n$ matrices over an arbitrary field $F$ that commute with each other. Then the determinant of the $2n \times 2n$ matrix $ M = \begin{pmatrix} A & B \\ C & D \end{pmatrix} $ is $\det(AD-BC)$.

I can see why this result is true if at least one of $A,B,C,$ or $D$ is invertible. For example, if $A$ is invertible, the standard Schur complement formula gives $\det M = \det A \cdot \det ( D - CA^{-1}B )$; since the matrices commute, $\det A \cdot \det ( D - CA^{-1}B ) = \det (AD - ACA^{-1}B) = \det(AD - CB) = \det (AD - BC).$
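For concreteness, here is a quick sanity check of this reduction in exact arithmetic with sympy. The particular matrices are illustrative choices: $B$, $C$, $D$ are taken to be polynomials in $A$, so all four commute.

```python
import sympy as sp

n = 3
A = sp.Matrix([[2, 1, 0], [0, 3, 1], [1, 0, 1]])  # det A = 7, so A is invertible
B = A**2 + sp.eye(n)    # B, C, D are polynomials in A, hence all four commute
C = 2*A - 3*sp.eye(n)
D = A**3 + A

# assemble the 2n x 2n block matrix M = [[A, B], [C, D]]
M = A.row_join(B).col_join(C.row_join(D))

lhs = M.det()
rhs = (A*D - B*C).det()
schur1 = A.det() * (D - C*A.inv()*B).det()  # standard Schur complement
schur2 = A.det() * (D - B*A.inv()*C).det()  # equal here, since B and C commute
print(lhs == rhs == schur1 == schur2)  # True
```

Because the arithmetic is over the rationals, the two Schur complement forms agree exactly, not just up to rounding.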

Using the above, one can see that this result is true when we are working over the field of real or complex numbers. For then we can always find a sequence $\epsilon_k \neq 0$ with $\epsilon_k \to 0$ such that $A_k = A + \epsilon_k I$ is invertible (since $A$ has only finitely many eigenvalues). Since $A_k, B, C,$ and $D$ commute, we get $\det \begin{pmatrix} A_k & B\\ C & D \end{pmatrix} = \det (A_k D - BC)$, and letting $k \to \infty$ gives the result by continuity of the determinant. But this argument does not generalize to arbitrary fields.

How do we approach this problem for arbitrary fields?

Accepted answer:

You can use the same trick. Take $A_{\lambda} = A + \lambda I$. Then we have

$$\det \begin{pmatrix} A_{\lambda}& B \\C & D \end{pmatrix}= \det (A_{\lambda} D - B C)$$ whenever $\det A_{\lambda} \ne 0$, and hence, for every $\lambda \in F$ (whether or not $A_{\lambda}$ is invertible), $$\det A_{\lambda} \cdot \left( \det \begin{pmatrix} A_{\lambda}& B \\C & D \end{pmatrix}- \det (A_{\lambda} D - B C) \right)=0. $$

The left-hand side is a polynomial in $\lambda$ that vanishes for every $\lambda \in F$. This does not yet force it to be the zero polynomial, since $F$ could be finite. However, we can embed $F$ into an infinite field (for instance the field of rational functions $F(t)$) and run the same argument there; a nonzero polynomial over an infinite field cannot vanish at every point, so the polynomial is identically $0$. Note that $\det A_{\lambda}$, as a polynomial in $\lambda$, is monic of degree $n$, hence nonzero. Since $F[\lambda]$ has no zero divisors, the factor inside the brackets must be the zero polynomial. Setting $\lambda = 0$ now gives the desired equality.
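To make the polynomial argument concrete, here is a small symbolic check with sympy over $\mathbb{Q}$. The matrices are illustrative commuting choices, with $A$ deliberately singular so that the perturbation $A_{\lambda} = A + \lambda I$ is actually needed.

```python
import sympy as sp

lam = sp.symbols('lam')
n = 2
A = sp.Matrix([[0, 1], [0, 0]])   # deliberately singular: det A = 0
B = A + 2*sp.eye(n)               # B, C, D are polynomials in A, so all commute
C = 3*A - sp.eye(n)
D = A**2 + sp.eye(n)

Al = A + lam*sp.eye(n)            # the perturbed matrix A_lambda
M = Al.row_join(B).col_join(C.row_join(D))

# det(A_lambda) * (det M_lambda - det(A_lambda*D - B*C)) as a polynomial in lam
p = sp.expand(Al.det() * (M.det() - (Al*D - B*C).det()))
print(p)                                            # 0: vanishes identically
print(M.det().subs(lam, 0) == (A*D - B*C).det())    # True
```

Over $\mathbb{Q}$ the vanishing at every $\lambda$ already forces the polynomial to be identically zero; the embedding into an infinite field is only needed when $F$ is finite.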

$\bf{Added:}$ Let us prove the following: if $A$, $B$, $C$, $D \in M_n(R)$, where $R$ is a commutative ring, commute with each other, then the equality above holds.

Assume first that $A$ is invertible, that is, $\det A$ is a unit in $R$. Then the Schur complement reduction from the question goes through verbatim, using the commutativity of the matrices (in particular $AB = BA$), and we get the equality.

Now consider the general case. Let $R' \colon = S^{-1} R[\lambda]$ be the localization of the polynomial ring $R[\lambda]$ at the multiplicative set $S = \{\det(\lambda I + A)^m \mid m\ge 0\} $. In $M_n(R')$ the matrix $A_{\lambda} = \lambda I + A$ is invertible by construction, and it commutes with $B$, $C$, and $D$. By the invertible case, the equality of determinants holds over $R'$. Unwinding the definition of the localization, this is equivalent to an equality

$$\det A_{\lambda}^m \cdot \left( \det \begin{pmatrix} A_{\lambda}& B \\C & D \end{pmatrix}- \det (A_{\lambda} D - B C) \right)=0 $$ in $R[\lambda]$ for some $m \ge 0$. Because $\det A_{\lambda}^m$ is a monic polynomial in $\lambda$ (of degree $mn$), it is not a zero divisor in $R[\lambda]$, so the second factor is $0$ in $R[\lambda]$. Now take $\lambda= 0$.
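As a sanity check of the commutative-ring statement, one can test the identity over a ring that is not a field, for instance $\mathbb{Z}/4\mathbb{Z}$. The matrices below are illustrative choices; determinants are computed over $\mathbb{Z}$ with sympy and then reduced mod 4, which is legitimate because the determinant is a polynomial with integer coefficients in the entries.

```python
import sympy as sp

n, mod = 2, 4
A = sp.Matrix([[2, 1], [0, 2]])   # det A = 4, which is 0 mod 4: not invertible in Z/4Z
B = A + sp.eye(n)                  # polynomials in A, so all four matrices commute
C = 3*A
D = A**2 + 2*sp.eye(n)

M = A.row_join(B).col_join(C.row_join(D))
# compute both determinants over Z, then reduce mod 4
print(M.det() % mod == (A*D - B*C).det() % mod)  # True
```

Here $\det A$ is a zero divisor mod 4, so neither the invertible case nor the field argument applies directly, yet the identity still holds, as the localization argument predicts.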