Extending the Proof for $\det(AB) = \det(A) \cdot \det(B)$ from the $3 \times 3$ to the $n \times n$ case


I want to prove that $\det(AB) = \det(A) \cdot \det(B)$ in the case where $A$ and $B$ are $n \times n$ matrices.

This seems to be a generalization of the case where $A$, $B$, and $C = AB$ are $3 \times 3$ matrices, but it doesn't seem to me that all of the same steps hold.

Specifically, if we took $C = AB$ and took each of the three columns of $C$ as vectors, we would say $\det(C) = (\vec{c}_1 \times \vec{c}_2) \cdot \vec{c}_3$.

Then, since we can write any column of $C$ as $\vec{c}_i = \sum\limits_{n=1}^3 b_{n,i}\, \vec{a}_n$, we could do the same for all three columns and write this triple product as $\det(C) = \left(\sum\limits_{n=1}^3 b_{n,1}\, \vec{a}_n\right) \times \left(\sum\limits_{k=1}^3 b_{k,2}\, \vec{a}_k\right) \cdot \left(\sum\limits_{m=1}^3 b_{m,3}\, \vec{a}_m\right)$.

From this point onward, I understand the proof in the $3 \times 3$ case.
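As a quick numeric sanity check of the $3 \times 3$ argument, the identity $\det(C) = (\vec{c}_1 \times \vec{c}_2) \cdot \vec{c}_3$ and the resulting multiplicativity can be verified in a few lines of pure Python (the helper names `det3`, `matmul`, etc. are my own, not from the proof):

```python
# Sanity check of the 3x3 case: det(C) computed as the scalar triple
# product (c1 x c2) . c3 of the columns, and det(AB) = det(A) * det(B).

def cross(u, v):
    """Cross product of two 3-vectors."""
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

def dot(u, v):
    return sum(ui*vi for ui, vi in zip(u, v))

def det3(M):
    """Determinant of a 3x3 matrix as the triple product of its columns."""
    c1, c2, c3 = ([row[j] for row in M] for j in range(3))
    return dot(cross(c1, c2), c3)

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k]*B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[2, 0, 1], [1, 3, -1], [0, 1, 4]]
B = [[1, 2, 0], [0, 1, 1], [3, 0, 2]]
C = matmul(A, B)

print(det3(C), det3(A) * det3(B))  # → 216 216
```

Expanding `det3(C)` symbolically, with each column of $C$ replaced by its expansion in the columns of $A$, is exactly the sum-of-triple-products step in the $3 \times 3$ proof.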

However, in generalizing this to the $n \times n$ case, my professor seems to have followed a similar procedure, writing the determinant of $C$ in terms of $n$ factors of the same form as the expansion above for a general column.

The triple-product formula for the determinant, I believe, applies only in the $3 \times 3$ case, so I don't think he is replicating that step. But it also doesn't seem that we could simply multiply the columns of $C$ together: the dimensions wouldn't line up. This might just be me failing to understand his write-up, but I'm hoping someone can make sense of it and explain how the simpler proof generalizes.

Thanks in advance.


2 Answers

Accepted answer:

It is well known that the determinant is the unique antisymmetric $n$-linear map $$ Det\colon \overbrace{\mathbb{R}^n\times \mathbb{R}^n\times \ldots \times \mathbb{R}^n}^{n\text{ times}} \to \mathbb{R} $$ which satisfies $Det(I_n)=1$ (you can find this in Hoffman's *Linear Algebra*).

Note that we identify a matrix with an $n$-tuple $(V_1,V_2,\ldots,V_n)$, where each $V_i\in \mathbb{R}^n$ corresponds to a column of the matrix.

By antisymmetric I mean that the map vanishes on any $n$-tuple $(V_1,V_2,\ldots,V_n)$ with $V_i=V_j$ for some $i\neq j$.

Now fix a matrix $A$ with $Det(A)\neq 0$, and define $\phi(B)=Det(AB)/Det(A)$. Since the columns of $AB$ are $AV_1,\ldots,AV_n$, where the $V_i$ are the columns of $B$, the map $\phi$ is an antisymmetric $n$-linear map with $\phi(I_n)=1$. So $\phi(B)=Det(B)$, that is, $Det(AB)=Det(A)\,Det(B)$. (When $Det(A)=0$, the matrix $AB$ is also singular, so both sides are zero.)
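This argument can be illustrated numerically: with $Det$ computed from the Leibniz permutation formula (an equivalent characterization), the map $\phi(B) = Det(AB)/Det(A)$ agrees with $Det(B)$ for a fixed invertible $A$. The helper names below (`det`, `matmul`, `phi`) are my own:

```python
# Leibniz formula for Det, and a check that phi(B) = Det(AB)/Det(A)
# coincides with Det(B) for one fixed invertible A.
from itertools import permutations
from math import prod

def det(M):
    """Leibniz formula: signed sum of products over all permutations."""
    n = len(M)
    def sign(p):
        inversions = sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))
        return -1 if inversions % 2 else 1
    return sum(sign(p) * prod(M[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[2, 1, 0, 0], [0, 1, 3, 0], [1, 0, 1, 2], [0, 0, 1, 1]]  # det(A) = 1

def phi(B):
    """The answer's map phi: B -> Det(AB) / Det(A), for the fixed A above."""
    return det(matmul(A, B)) / det(A)

B = [[1, 0, 2, 0], [0, 3, 0, 1], [1, 1, 0, 0], [0, 2, 1, 1]]
print(phi(B) == det(B))  # → True
```

Of course the real content of the answer is the uniqueness theorem, not any single numeric instance; the code only shows what $\phi$ is.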

Second answer:

If $\det(A)=0$ or $\det(B)=0$, then the corresponding matrix is not invertible, so the product $AB$ is not invertible and $\det(AB)=0=\det(A)\cdot\det(B)$.

Otherwise, both matrices are invertible, so we may write each as a product of elementary matrices and use the fact that the determinant is multiplicative when one factor is elementary: $\det(EM)=\det(E)\det(M)$ for any elementary matrix $E$.
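The elementary-matrix fact this argument rests on can be checked directly. The sketch below (helper names mine, determinant via the Leibniz formula) verifies $\det(EM)=\det(E)\det(M)$ for one elementary matrix of each of the three types:

```python
# Check det(E M) = det(E) det(M) for one elementary matrix of each type:
# a row swap, a row scaling, and a row addition.
from itertools import permutations
from math import prod

def det(M):
    """Leibniz formula for the determinant."""
    n = len(M)
    def sign(p):
        inversions = sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))
        return -1 if inversions % 2 else 1
    return sum(sign(p) * prod(M[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

M = [[2, 1, 0], [1, 3, -1], [0, 1, 4]]

swap  = [[0, 1, 0], [1, 0, 0], [0, 0, 1]]   # swap rows 1 and 2:   det = -1
scale = [[1, 0, 0], [0, 5, 0], [0, 0, 1]]   # scale row 2 by 5:    det = 5
shear = [[1, 0, 0], [3, 1, 0], [0, 0, 1]]   # add 3*row 1 to row 2: det = 1

for E in (swap, scale, shear):
    assert det(matmul(E, M)) == det(E) * det(M)
print("det(E M) = det(E) det(M) holds for all three types")
```

Once multiplicativity is known for elementary factors, $\det(AB)$ for invertible $A = E_1 E_2 \cdots E_k$ follows by peeling off one elementary matrix at a time.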