Questions concerning $\det \begin{pmatrix} A & B \\ C & D \end{pmatrix} = \det \begin{pmatrix} D & C \\ B & A \end{pmatrix}$


Let $A, B, C, D $ be $n \times n $ matrices. Using Schur complements I have found that $$ \begin{pmatrix} A & B \\ C & D \end{pmatrix} = \begin{pmatrix} A & 0 \\ 0 & I \end{pmatrix} \begin{pmatrix} I & 0 \\ C & I \end{pmatrix} \begin{pmatrix} I & A^{-1}B \\ 0 & D-CA^{-1}B \end{pmatrix} $$ and $$ \begin{pmatrix} D & C \\ B & A \end{pmatrix} = \begin{pmatrix} I & CA^{-1} \\ 0 & I \end{pmatrix} \begin{pmatrix} D-CA^{-1}B & 0 \\ 0 & A \end{pmatrix} \begin{pmatrix} I & 0 \\ A^{-1}B & I \end{pmatrix}, $$ from which the determinant equality follows, as long as $ A^{-1} $ exists. However, how do I tackle this when $A$ is singular? I could instead use the analogous decompositions built around $D^{-1}$, but then I run into the same problem when $D$ is singular. Maybe one could derive two more decompositions using Schur complements, involving $B^{-1}$ and $C^{-1}$, respectively, and then conclude that the equality holds whenever at least one of the four submatrices is nonsingular. But when all four submatrices are singular, does the determinant necessarily vanish, so that equality would hold trivially?
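The decompositions above are easy to sanity-check numerically. The sketch below (NumPy assumed; blocks drawn at random, so $A$ is almost surely invertible) rebuilds the first factorization and confirms that both determinants equal $\det(A)\det(D - CA^{-1}B)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A, B, C, D = (rng.standard_normal((n, n)) for _ in range(4))

M = np.block([[A, B], [C, D]])   # original block matrix
N = np.block([[D, C], [B, A]])   # blocks swapped across the diagonal

Ainv = np.linalg.inv(A)
S = D - C @ Ainv @ B             # Schur complement of A
I = np.eye(n)
Z = np.zeros((n, n))

# First decomposition: M = F1 @ F2 @ F3
F1 = np.block([[A, Z], [Z, I]])
F2 = np.block([[I, Z], [C, I]])
F3 = np.block([[I, Ainv @ B], [Z, S]])
assert np.allclose(M, F1 @ F2 @ F3)

# Both determinants equal det(A) * det(S)
target = np.linalg.det(A) * np.linalg.det(S)
assert np.isclose(np.linalg.det(M), target)
assert np.isclose(np.linalg.det(N), target)
```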

Also, does the equality hold when $A, B, C, D$ are not necessarily square but of matching sizes? In that case the decompositions above no longer seem valid, since $A$ and $D$ need not be square matrices.

There are 3 answers below.

On BEST ANSWER

Wouldn't it be much easier to consider row/column swaps? By performing $n$ row swaps (swapping row $i$ with row $n+i$ for $i = 1, \dots, n$), you transform $$\begin{pmatrix} A & B\\C & D \end{pmatrix} \to \begin{pmatrix} C&D\\A&B\end{pmatrix},$$ then $n$ analogous column swaps transform $$\begin{pmatrix} C&D\\A&B\end{pmatrix} \to \begin{pmatrix} D&C\\B&A \end{pmatrix}.$$ Each row/column swap multiplies the determinant by $-1$, so this whole process multiplies the determinant by $(-1)^{2n} = 1$.
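The swap argument requires no invertibility at all, and it can be checked numerically; a short sketch (NumPy assumed) performs the $2n$ explicit swaps and confirms the determinant is unchanged:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A, B, C, D = (rng.standard_normal((n, n)) for _ in range(4))

M = np.block([[A, B], [C, D]])
swapped = M.copy()

# n row swaps: row i <-> row n + i gives [[C, D], [A, B]]
for i in range(n):
    swapped[[i, n + i]] = swapped[[n + i, i]]

# n column swaps: column j <-> column n + j gives [[D, C], [B, A]]
for j in range(n):
    swapped[:, [j, n + j]] = swapped[:, [n + j, j]]

assert np.allclose(swapped, np.block([[D, C], [B, A]]))

# 2n swaps multiply the determinant by (-1)^(2n) = 1
assert np.isclose(np.linalg.det(M), np.linalg.det(swapped))
```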

On

One approach is to take advantage of the continuity of the determinant: $A + tI$ is invertible for all sufficiently small $t \neq 0$, so if $A$ is singular we have $$ \det \pmatrix{A &B\\C&D} = \lim_{t \to 0} \det\pmatrix{A + tI & B\\C&D} = \lim_{t \to 0} \det \pmatrix{D&C\\B&A + tI} = \det \pmatrix{D&C\\B&A}. $$ Alternatively, you could note that $$ \pmatrix{D&C\\B&A} = \pmatrix{0&I\\I&0} \pmatrix{A&B\\C&D} \pmatrix{0&I\\I&0}, $$ so it only remains to show that $$ \det \pmatrix{0&I_n\\I_n&0} = (-1)^n. $$

On

You can use the following proposition:

Let $ A=\begin{pmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{pmatrix} \in M_{m_1+m_2,\,n_1+n_2}$, where $A_{ij}$ is $m_i \times n_j$. Then we have

(a) If $A_{11}$ is invertible, then $\det A = \det (A_{11}) \det(A_{22}-A_{21}A_{11}^{-1}A_{12})$.

(b) If $A_{22}$ is invertible, then $\det A = \det (A_{22}) \det(A_{11}-A_{12}A_{22}^{-1}A_{21})$.

This answers your question in the case where $A$ or $D$ (in your notation) is singular: when $A$ is invertible, use (a), and if $A$ is singular but $D$ is invertible, use (b). (The proof of the proposition is similar to what you have shown above.) Also, the blocks $A_{ij}$ need not be square, but they must be of matching sizes.
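Case (b) can be checked numerically with a deliberately singular $A$ and a (generically) invertible $D$; a sketch, NumPy assumed:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
A, B, C, D = (rng.standard_normal((n, n)) for _ in range(4))
A[:, 0] = A[:, 1]   # force A to be singular (two equal columns)
assert abs(np.linalg.det(A)) < 1e-8

M = np.block([[A, B], [C, D]])

# Formula (b): expand around the invertible block D
Dinv = np.linalg.inv(D)
lhs = np.linalg.det(M)
rhs = np.linalg.det(D) * np.linalg.det(A - B @ Dinv @ C)
assert np.isclose(lhs, rhs)

# The same expansion applied to the swapped matrix gives the equality
N = np.block([[D, C], [B, A]])
assert np.isclose(np.linalg.det(N), lhs)
```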

Furthermore, using this proposition, it is a trivial observation that if at least one of $A$ and $D$ is nonsingular, then $ \det \begin{pmatrix} A & B \\ C & D \end{pmatrix} = \det \begin{pmatrix} D & C \\ B & A \end{pmatrix}. $

Finally, note that this argument requires at least one of the two matrices on the principal diagonal, that is, $A$ or $D$, to be nonsingular.