Proof that $\det(AB) = \det(A) \det(B)$


I have been working on this proof and, after some very helpful comments from people on here, I think I may understand it. I was hoping someone could look this over. I'll try to explain each step in as much depth as I can. I apologize for rookie mistakes: I'm still quite new to this subject.

$\textbf{Theorem.}$ For $n \times n$ matrices $A$ and $B$, $\det(AB) = \det(A) \cdot \det(B)$.

$\textbf{Proof.}$ Take $C = AB$. An arbitrary column of $C$ takes the form
\begin{align*} \vec{c}_i & = A \vec{b}_i & & \text{Definition, $C = AB$, and matrix multiplication} \\ & = A \sum_{k=1}^n b_{k,i} \vec{e}_k & & \text{Write each column of $B$ as a sum of coefficients times the $k$th standard basis vector} \\ & = \sum_{k=1}^n b_{k,i} A (\vec{e}_k) & & \text{Linearity of $A$} \\ & = \sum_{k=1}^n b_{k,i} \vec{a}_k & & \text{A matrix acting on the $k$th standard basis vector returns the $k$th column of $A$} \end{align*}
Thus, the matrix $C$ consists of $n$ columns of this form $\vec{c}_i$. In other words, we have $$ \det(C) = \det (\vec{c}_1, \vec{c}_2, \vec{c}_3, \ldots, \vec{c}_n) $$ or, with somewhat messier notation, \begin{align*} \det(C) = \det \Bigg(\sum_{n_1=1}^n b_{n_1, 1} \vec{a}_{n_1} , \sum_{n_2=1}^n b_{n_2, 2} \vec{a}_{n_2}, \sum_{n_3=1}^n b_{n_3, 3} \vec{a}_{n_3}, \ldots, \sum\limits_{n_n=1}^n b_{n_n, n} \vec{a}_{n_n}\Bigg) \end{align*}

Next, we can draw on the multilinearity property of the determinant, which tells us that the determinant is linear in each column of its argument. This means that if one of our columns is a linear combination of other vectors, say $\vec{a}_i = \alpha \vec{b} + \beta \vec{c}$ for some arbitrary column $\vec{a}_i$ of a matrix $A$, we can write \begin{align*} \det(A) = \det(\vec{a}_1, \vec{a}_2, \vec{a}_3, \ldots, \vec{a}_i, \ldots, \vec{a}_n) & = \det(\vec{a}_1, \vec{a}_2, \vec{a}_3, \ldots, \alpha \vec{b} + \beta \vec{c}, \ldots, \vec{a}_n) \\ & = \alpha \det(\vec{a}_1, \vec{a}_2, \vec{a}_3, \ldots, \vec{b}, \ldots, \vec{a}_n) + \beta \det(\vec{a}_1, \vec{a}_2, \vec{a}_3, \ldots, \vec{c}, \ldots, \vec{a}_n) \end{align*} This allows us to pull out the coefficients, of the form $b_{n_k, k}$, summed over $n_1, n_2, n_3, \ldots, n_n$.

This gives us, after factoring out $\det(\vec{a}_1, \vec{a}_2, \vec{a}_3, \ldots, \vec{a}_n) = \det(A)$: \begin{align*} \det(C) = \sum\limits_{n_1, n_2, n_3, \ldots, n_n} b_{n_1, 1} b_{n_2, 2} \cdots b_{n_n, n} \cdot \det(\vec{a}_1, \vec{a}_2, \vec{a}_3, \ldots, \vec{a}_n) & = \sum\limits_{n_1, n_2, n_3, \ldots, n_n} b_{n_1, 1} b_{n_2, 2} \cdots b_{n_n, n} \cdot \det(A) \end{align*} Let us now denote $\sum\limits_{n_1, n_2, n_3, \ldots, n_n} b_{n_1, 1} b_{n_2, 2} \cdots b_{n_n, n}$ by $f(B)$, so that $$ \det(C) = \det(AB) = f(B) \cdot \det(A) $$ Since $f(B)$ depends only on $B$, it is the same for every choice of $A$. Take $A = I$, in which case $\det(A) = \det(I) = 1$, and also $\det(C) = \det(AB) = \det(IB) = \det(B)$. We thus get $$ \det(B) = f(B) \cdot 1 = f(B) $$ Thus, $\det(C) = f(B) \cdot \det(A) \implies \det(C) = \det(B) = \det(A)$, and thus $\det(AB) = \det(A) \cdot \det(B)$, as desired.
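As a quick numerical spot check of the claimed identity itself (not of the argument), one can compare $\det(AB)$ with $\det(A)\det(B)$ on random matrices. A minimal sketch in Python, assuming NumPy is available:

```python
import numpy as np

# Spot check: det(AB) == det(A) * det(B) for random 4x4 matrices
rng = np.random.default_rng(42)
for _ in range(100):
    A = rng.standard_normal((4, 4))
    B = rng.standard_normal((4, 4))
    assert np.isclose(np.linalg.det(A @ B),
                      np.linalg.det(A) * np.linalg.det(B))
print("det(AB) == det(A) * det(B) held on 100 random pairs")
```

Of course, passing on random examples is no substitute for the proof; it only rules out a gross error in the statement.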

How does this proof look? My apologies for the rather messy notation.


$\textbf{Best answer.}$

First: Typo at the end. $\det(C) = \det(B) = \det(A)$ is incorrect; it should read $\det(C) = \det(B) \cdot \det(A)$.

Second: Perhaps the indices $k_1, \dots, k_n$ would be better than $n_1, \dots, n_n$.

Third: The statement $f(B) = \det(B)$ cannot be correct. For instance, consider the matrix $B = \begin{pmatrix} 1 & 1 \\ 1 & 1\end{pmatrix}$, whose determinant is $0$, but $f(B) = 4 > 0$. There are signs missing in the expression for $f(B)$.

Fourth: The place where this went wrong is when you pulled out the $b$'s: you wrote $\det(\vec a_1, \dots, \vec a_n)$, but you should have gotten $\det(\vec a_{n_1}, \dots, \vec a_{n_n})$. Reordering those columns back into $\det(\vec a_1, \dots, \vec a_n)$ costs a sign for each swap, and that is why there are missing signs in your function $f(B)$.
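To make the missing signs concrete, here is a short Python check (NumPy used only for convenience) comparing the unsigned sum $f(B)$ from the question with the signed sum over permutations (the Leibniz formula), for the all-ones $2 \times 2$ matrix above:

```python
import itertools
import numpy as np

B = np.array([[1.0, 1.0],
              [1.0, 1.0]])
n = B.shape[0]

# f(B): the unsigned sum from the question -- all index tuples, no signs
f_B = sum(
    np.prod([B[idx[k], k] for k in range(n)])
    for idx in itertools.product(range(n), repeat=n)
)

def perm_sign(p):
    """Sign of a permutation, via its inversion count."""
    inversions = sum(
        1 for i in range(len(p)) for j in range(i + 1, len(p)) if p[i] > p[j]
    )
    return -1 if inversions % 2 else 1

# Leibniz formula: only permutation tuples survive, each weighted by its sign
signed = sum(
    perm_sign(p) * np.prod([B[p[k], k] for k in range(n)])
    for p in itertools.permutations(range(n))
)

print(f_B)     # 4.0 -- not equal to det(B)
print(signed)  # 0.0 -- equals det(B)
```

The unsigned sum counts all $n^n$ index tuples; with the signs restored, the non-permutation tuples vanish (repeated columns give determinant zero) and the rest collapse to $\det(B)$.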

$\textbf{Another answer.}$

The answer by user357980 notes a few issues with your argument. The main issue, in my view, is that it can be done with much less work. Namely, one only needs to show that

  • $\det A'=-\det A$, if $A'$ is obtained from $A$ by exchanging two columns

  • $\det A'=c\,\det A$, if $A'$ is obtained from $A$ by multiplying a column by $c$

  • $\det A'=\det A$, if $A'$ is obtained from $A$ by replacing one column of $A$ with said column plus a multiple of another column
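These three rules are easy to confirm numerically. A small sketch in Python with NumPy (the matrix, the choice of columns, and the scalars below are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
d = np.linalg.det(A)

# Rule 1: swapping columns 0 and 2 flips the sign of the determinant
A_swap = A.copy()
A_swap[:, [0, 2]] = A_swap[:, [2, 0]]
assert np.isclose(np.linalg.det(A_swap), -d)

# Rule 2: scaling column 1 by c scales the determinant by c
c = 3.5
A_scale = A.copy()
A_scale[:, 1] *= c
assert np.isclose(np.linalg.det(A_scale), c * d)

# Rule 3: adding a multiple of column 3 to column 0 leaves it unchanged
A_add = A.copy()
A_add[:, 0] += 2.0 * A_add[:, 3]
assert np.isclose(np.linalg.det(A_add), d)

print("all three column-operation rules verified")
```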

Given the above, the three rules become

  • $\det EA=\det E\,\det A$, if $E$ is the elementary matrix that flips two columns

  • $\det EA=\det E\,\det A$, if $E$ is the elementary matrix that multiplies a column by a number

  • $\det EA=\det E\,\det A$, if $E$ is the elementary matrix that adds a multiple of a column to a column.

Now, any invertible $B$ is a product of elementary matrices, so applying the above repeatedly (induction on the number of elementary factors) gives $\det BA=\det B\,\det A$.

If $B$ is not invertible, then $\det B=0$, and also $BA$ is not invertible, so $\det BA=0$. Thus $\det BA=\det B\,\det A$ holds in this case as well.
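The singular case can also be checked numerically; in this sketch (my own example), $B$ has a repeated column, hence $\det B = 0$, and $BA$ is singular as well:

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((3, 3))

# B is singular: its second column repeats its first
B = rng.standard_normal((3, 3))
B[:, 1] = B[:, 0]

assert np.isclose(np.linalg.det(B), 0.0)
assert np.isclose(np.linalg.det(B @ A), 0.0)  # BA is singular too
assert np.isclose(np.linalg.det(B @ A), np.linalg.det(B) * np.linalg.det(A))
print("singular case: det(BA) == det(B) * det(A) == 0")
```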