Prove $|A.B|=|A|.|B|$ using matrix algebra


How does one prove that the determinant of a product of matrices equals the product of their determinants? That is,

$$ |A.B|=|A|.|B| $$ where $A,B$ are square matrices of the same order.

I have no clue where to start; I only know the following: $$ |A|I=A(\operatorname{adj}A)=(\operatorname{adj}A)A $$ Note: I have checked a similar question, Proving determinant product rule combinatorially, which gives a combinatorial proof of the statement, but that approach is rather cumbersome for a matrix problem. So I hope it is worth asking for an alternative proof using only the properties of matrices.
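Not a proof, of course, but as a quick sanity check the identity is easy to confirm numerically on random matrices (a small Python/NumPy sketch; the matrix size and random seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two random 4x4 matrices; compare det(AB) with det(A) * det(B).
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

lhs = np.linalg.det(A @ B)
rhs = np.linalg.det(A) * np.linalg.det(B)
assert np.isclose(lhs, rhs)
```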


There are 3 answers below.


Let $A,B\in \mathbb{R}^{n\times n}$ be two $n\times n$ matrices.

Now we can distinguish between 2 cases:

Case 1: $A$ isn't invertible. Then the product $AB$ isn't invertible either: if $AB$ had an inverse $C$, then $A\cdot(BC)=I$ would make $A$ invertible. In this case $|AB|=|A|\cdot|B|$ holds true, since both sides are equal to $0$.

Case 2: $A$ is invertible. Then $A$ is row equivalent to the identity matrix $I_n$, so it factors as a product of elementary matrices: $$A=E_p\cdot E_{p-1}\ldots E_1\cdot I_n=E_p\cdot E_{p-1}\ldots E_1$$

And therefore, using the fact that $|E\cdot M|=|E|\cdot|M|$ whenever $E$ is elementary (each elementary row operation multiplies the determinant by a fixed factor): $$\begin{align*} |AB| &= |E_p\cdot E_{p-1}\ldots E_1\cdot B| \\ &=|E_p|\cdot|E_{p-1}\ldots E_1\cdot B| \\ &= |E_p|\cdot|E_{p-1}|\cdot |\ldots E_1\cdot B| \\ &=\ldots \\ &= |E_p\ldots E_1|\cdot |B| \\ &= |A|\cdot|B|\end{align*}$$
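The factorization used above can be mirrored numerically. The sketch below (assuming NumPy; `elementary_factors` is an illustrative Gauss-Jordan helper, not a library function) row-reduces an invertible $A$ to $I$ with elementary matrices and checks that their determinants multiply out the way the proof predicts:

```python
import numpy as np

def elementary_factors(A):
    """Row-reduce an invertible matrix A to the identity by Gauss-Jordan
    elimination, returning elementary matrices E_1, ..., E_p with
    E_p @ ... @ E_1 @ A = I."""
    n = A.shape[0]
    M = A.astype(float).copy()
    factors = []
    for col in range(n):
        # Swap in the largest available pivot if needed (a row-swap E).
        pivot = col + int(np.argmax(np.abs(M[col:, col])))
        if pivot != col:
            E = np.eye(n)
            E[[col, pivot]] = E[[pivot, col]]
            M = E @ M
            factors.append(E)
        # Scale the pivot row so the pivot becomes 1 (a row-scaling E).
        E = np.eye(n)
        E[col, col] = 1.0 / M[col, col]
        M = E @ M
        factors.append(E)
        # Clear the rest of the column (row-addition E's).
        for row in range(n):
            if row != col and abs(M[row, col]) > 1e-12:
                E = np.eye(n)
                E[row, col] = -M[row, col]
                M = E @ M
                factors.append(E)
    return factors

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))
Es = elementary_factors(A)

# Since E_p ... E_1 A = I, the determinants of the factors satisfy
# |E_p| ... |E_1| * |A| = 1, and peeling factors off |AB| one at a
# time gives the product rule.
det_Es = np.prod([np.linalg.det(E) for E in Es])
assert np.isclose(det_Es * np.linalg.det(A), 1.0)
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))
```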


Thanks @ZeroXLR for the hint.

$$ |A.B|=|A|.|B| $$

$\color{blue}{\text{Theorem 1}}:$ Suppose that $A$ is a square matrix of size $n$ and $E$ is any elementary matrix of size $n$. Then $$\boxed{ |E.A|=|E|.|A|} $$

Proof :

1.) Scaling row $l$ by $k$: $A=[a_{i,j}]$ and $B=E_l(k).A=[b_{i,j}]$, where $b_{l,j}=k.a_{l,j}$ and all other rows are unchanged.

$$ |E_l(k).A|=|B|=\sum_{j=1}^{n} (-1)^{l+j}.b_{l,j}.|b(l\ |\ j)|\\ =\sum_{j=1}^{n} (-1)^{l+j}.b_{l,j}.|a(l\ |\ j)|\\ =\sum_{j=1}^{n} (-1)^{l+j}.k.a_{l,j}.|a(l\ |\ j)|\\ =k.\sum_{j=1}^{n} (-1)^{l+j}.a_{l,j}.|a(l\ |\ j)|\\ \color{red}{|E_l(k).A|=k.|A|=|E_l(k)|.|A|} $$

2.) Swapping of adjacent rows: $A=[a_{i,j}]$ and $B=E_{l,l+1}.A=[b_{i,j}]$, where $b_{l+1,j}=a_{l,j}$ and $b_{l,j}=a_{l+1,j}$. Expanding along row $l+1$ of $B$, whose minors coincide with the row-$l$ minors of $A$: $$ |E_{l,l+1}.A|=|B|=\sum_{j=1}^{n}(-1)^{l+1+j}.b_{l+1,j}.|b(l+1\ |\ j)|\\ =\sum_{j=1}^{n}(-1)^{l+j}.(-1)^1.a_{l,j}.|a(l\ |\ j)|\\ =(-1).\sum_{j=1}^{n}(-1)^{l+j}.a_{l,j}.|a(l\ |\ j)|\\ \color{red}{|E_{l,l+1}.A|=-|A|=|E_{l,l+1}|.|A|} $$

3.) Adding $k$ times row $l$ to row $m$: $A=[a_{i,j}]$ and $B=E_{l,m}(k).A=[b_{i,j}]$, where $b_{m,j}=a_{m,j}+k.a_{l,j}$ and all other rows are unchanged. The second sum below is the expansion of a matrix whose row $m$ is a copy of row $l$, so it vanishes: $$ |E_{l,m}(k).A|=|B|=\sum_{j=1}^{n} (-1)^{m+j}.b_{m,j}.|b(m\ |\ j)|\\ =\sum_{j=1}^{n} (-1)^{m+j}.(a_{m,j}+k.a_{l,j}).|a(m\ |\ j)|\\ =\sum_{j=1}^{n} (-1)^{m+j}.a_{m,j}.|a(m\ |\ j)|+k.\sum_{j=1}^{n} (-1)^{m+j}.a_{l,j}.|a(m\ |\ j)|=|A|+0\\ \color{red}{|E_{l,m}(k).A|=|A|=|E_{l,m}(k)|.|A|} $$

$\color{blue}{\text{Theorem 2}}:$ For the three types of elementary matrices we have the determinants, $$ \boxed{|E_{i,j}|=-1\\ |E_i{(k)}|=k\\ |E_{i,j}(k)|=1} $$ Proof :

Using $\color{blue}{\text{Theorem 1}}$,

$$ |E_{i,j}|=|E_{i,j}.I|=-|I|=-1\\ |E_i(k)|=|E_i(k).I|=k|I|=k.1=k\\ |E_{i,j}(k)|=|E_{i,j}(k).I|=|I|=1 $$
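These three determinant values are easy to confirm numerically (a small NumPy sketch; the size $n$, the indices, and the value of $k$ are arbitrary choices):

```python
import numpy as np

n, i, j, k = 4, 1, 3, 2.5

# E_{i,j}: swap rows i and j of the identity.
E_swap = np.eye(n)
E_swap[[i, j]] = E_swap[[j, i]]

# E_i(k): scale row i of the identity by k.
E_scale = np.eye(n)
E_scale[i, i] = k

# E_{i,j}(k): add k times row i to row j of the identity.
E_add = np.eye(n)
E_add[j, i] = k

assert np.isclose(np.linalg.det(E_swap), -1)
assert np.isclose(np.linalg.det(E_scale), k)
assert np.isclose(np.linalg.det(E_add), 1)
```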

$\color{blue}{\text{Theorem 3}}:$ If $A$ is a nonsingular matrix, then there exist elementary matrices $E_1,E_2,E_3,\ldots,E_n$ such that,

$$\boxed{ A=E_n....E_3.E_2.E_1.I=E_n....E_3.E_2.E_1} $$

$\color{blue}{\text{Theorem 4}}:$ The equation $Ax=0$ has a nontrivial solution if and only if $|A|=0$. (Recall that a matrix $A$ is called singular precisely when $Ax=0$ has a nonzero solution.)

Proof :

If $A$ is non-singular, i.e. $|A|\neq0$, $$ Ax=b\implies A^{-1}\big(Ax\big)=A^{-1}b \quad (A \text{ is non-singular}\implies A\text{ is invertible}) \\Ix=A^{-1}b\implies x=A^{-1}b $$ Thus, $$ Ax=0\implies x=0 $$ so if $A$ is non-singular, $Ax=0$ has only the trivial solution $x=0$. Conversely, if $|A|=0$ then $A$ cannot be row reduced to the identity, so the system $Ax=0$ has a free variable and therefore a nontrivial solution. Thus, $$\boxed{ Ax=0 \text{ has a non-trivial solution}\iff |A|=0 \text{ }\big(\text{i.e. }A\text{ is singular}\big)} $$

Case 1 : If $B$ is singular,

From $\color{blue}{\text{Theorem 4}}$,

there is a vector $x\neq 0$ such that $Bx=0$. Hence, $$ ABx=A(Bx)=A.0=0 $$ so the equation $ABx=0$ also has a nontrivial solution $x\neq0$. Hence $AB$ is singular, i.e. $|AB|=0$, and $$ |AB|=0=|A||B| $$
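A quick numerical illustration of this case (assuming NumPy; the particular singular $B$ is constructed by making its third column the sum of the first two, so $x=(1,1,-1)$ is a null vector by design):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((3, 3))

# Make B singular: third column = first column + second column.
B = rng.standard_normal((3, 3))
B[:, 2] = B[:, 0] + B[:, 1]

# x = (1, 1, -1) satisfies Bx = 0 by construction, hence (AB)x = 0 too.
x = np.array([1.0, 1.0, -1.0])
assert np.allclose(B @ x, 0)
assert np.allclose(A @ B @ x, 0)
assert np.isclose(np.linalg.det(A @ B), 0)
```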

Case 2 : If $B$ is non-singular, then either $A$ is singular, in which case $AB$ is singular as well (otherwise $A=(AB)B^{-1}$ would be invertible) and $|AB|=0=|A||B|$; or $A$ is non-singular, and we may apply $\color{blue}{\text{Theorem 3}}$ to $A$:

$$ |AB|=|E_n....E_3.E_2.E_1.B|\qquad\color{blue}{\big(\text{Theorem 3}\big)} \\=|E_n|....|E_3|.|E_2|.|E_1|.|B|\qquad\color{blue}{\big(\text{Theorem 1}\big)} \\=|E_n....E_3.E_2.E_1|.|B|\qquad\color{blue}{\big(\text{Theorem 1}\big)} \\=|A|.|B|\qquad\color{blue}{\big(\text{Theorem 3}\big)} $$


Notations like $\sum_j$ or $\prod_k$ will always mean $\sum_{j=1}^n$ and $\prod_{k=1}^n$; there's going to be enough typing here even with that convention.

If $\sigma\in S_n$ define $|\sigma|=1$ if $\sigma$ is even, $-1$ if $\sigma$ is odd. The simplest definition of the determinant is $$|A|=\sum_{\sigma\in S_n}|\sigma|\prod_ja_{j,\sigma(j)}.$$
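For concreteness, this definition can be transcribed directly into code and checked against a library determinant (a Python sketch; `perm_sign` and `leibniz_det` are illustrative helpers, and the $O(n!)$ cost makes this useful only for tiny $n$):

```python
import itertools
import math
import numpy as np

def perm_sign(sigma):
    """Sign of a permutation given as a tuple: +1 if even, -1 if odd,
    counted via the number of inversions."""
    inversions = sum(1 for a, b in itertools.combinations(range(len(sigma)), 2)
                     if sigma[a] > sigma[b])
    return -1 if inversions % 2 else 1

def leibniz_det(A):
    """Determinant via the Leibniz formula: sum over all permutations
    sigma of sign(sigma) * prod_j A[j, sigma(j)]."""
    n = A.shape[0]
    return sum(perm_sign(sigma) * math.prod(A[j, sigma[j]] for j in range(n))
               for sigma in itertools.permutations(range(n)))

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
assert np.isclose(leibniz_det(A), np.linalg.det(A))
```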

Ok, maybe that doesn't look as simple as various other definitions, but it's often the simplest to use when proving things about determinants; if a formula is true, it must follow just by plugging in the definition and working it out. We're going to use this to show the determinant is multiplicative - the proof is "simple" in that it uses absolutely no previous results, nothing but the fact that multiplication distributes over addition.

In general what is the product of a bunch of sums, $\prod_j\sum_k\alpha_{jk}$? It's the sum of products, each product being the product of one term from the first sum times one term from the second sum, etc. That is, if we let $X$ be the set of all functions from $\{1,2,\dots,n\}$ to itself, in general we have $$\prod_j\sum_k\alpha_{jk} =\sum_{\psi\in X}\prod_j\alpha_{j,\psi(j)}.$$
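This distributivity identity can be spot-checked numerically (a Python sketch; the array `alpha` and the size $n$ are arbitrary, and `X` enumerates all $n^n$ functions as tuples of values):

```python
import itertools
import math
import numpy as np

rng = np.random.default_rng(5)
n = 3
alpha = rng.standard_normal((n, n))  # alpha[j, k]

# Left-hand side: product over j of (sum over k of alpha[j, k]).
lhs = math.prod(alpha[j].sum() for j in range(n))

# Right-hand side: sum over all functions psi: {0,...,n-1} -> {0,...,n-1}
# of the product over j of alpha[j, psi(j)].
X = list(itertools.product(range(n), repeat=n))
rhs = sum(math.prod(alpha[j, psi[j]] for j in range(n)) for psi in X)

assert np.isclose(lhs, rhs)
```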

We extend the notation $|\sigma|$ to $X$ by saying $|\sigma|=0$ if $\sigma\in X\setminus S_n$. Note that if $\sigma,\psi\in X$ and $\sigma\psi\in S_n$ then $\sigma$ must be surjective and $\psi$ must be injective, so in fact $\sigma,\psi\in S_n$. Hence $$|\sigma\psi|=|\sigma|\,|\psi|\quad(\sigma,\psi\in X).$$
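The extended sign and its multiplicativity over all of $X$ can be verified exhaustively for small $n$ (a Python sketch; `sign` and `compose` are illustrative helpers, with functions encoded as tuples of values):

```python
import itertools

n = 3
# X: all functions {0,...,n-1} -> {0,...,n-1}, encoded as tuples of values.
X = list(itertools.product(range(n), repeat=n))

def sign(f):
    """Extended sign: 0 if f is not a bijection, else the permutation parity."""
    if len(set(f)) < n:
        return 0
    inversions = sum(1 for a, b in itertools.combinations(range(n), 2)
                     if f[a] > f[b])
    return -1 if inversions % 2 else 1

def compose(f, g):
    # (f o g)(j) = f(g(j))
    return tuple(f[g[j]] for j in range(n))

# |sigma psi| = |sigma| |psi| for ALL functions, not just permutations.
assert all(sign(compose(f, g)) == sign(f) * sign(g) for f in X for g in X)
```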

Now say $AB=C$. In the obvious notation this says$$c_{jk}=\sum_ia_{ji}b_{ik}.$$So if $\sigma\in S_n$ then$$ \prod_jc_{j,\sigma(j)}=\sum_{\psi\in X}\prod_ja_{j,\psi(j)}b_{\psi(j),\sigma(j)}.$$

Hence $$|AB|=\sum_{\psi\in X}\sum_{\sigma\in S_n}|\sigma|\prod_ja_{j,\psi(j)}b_{\psi(j),\sigma(j)}.$$

Now note that since $|\psi|=0$ for $\psi\in X\setminus S_n$ we have $$|A|=\sum_{\psi\in X}|\psi|\prod_ja_{j,\psi(j)}.$$Multiplying this by the original definition of $|B|$, and using $|\sigma\psi|=|\sigma|\,|\psi|$, gives $$|A|\,|B|=\sum_{\psi\in X}\sum_{\sigma\in S_n}|\sigma\psi|\prod_ja_{j,\psi(j)}b_{j,\sigma(j)}.$$

Since for every term either $|\sigma\psi|=0$ or $\psi$ is a permutation of $\{1,2,\dots,n\}$ (in which case the product over $j$ may be reindexed, replacing $j$ by $\psi(j)$ in the $b$ factors), that last is equal to $$\sum_{\psi\in X}\sum_{\sigma\in S_n}|\sigma\psi|\prod_ja_{j,\psi(j)}b_{\psi(j),\sigma(\psi(j))}.$$

And again, since if $|\psi|\ne0$ then $\sigma\mapsto\sigma\psi$ is a bijection on $S_n$, reindexing the inner sum shows that last expression is equal to $$\sum_{\psi\in X}\sum_{\sigma\in S_n}|\sigma|\prod_ja_{j,\psi(j)}b_{\psi(j),\sigma(j)},$$which is exactly the expression obtained for $|AB|$ above.
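The key expansion above, $|AB|$ as a double sum over $\psi\in X$ and $\sigma\in S_n$, can be checked directly for small $n$ (a Python sketch; brute force over all $n^n$ functions and $n!$ permutations):

```python
import itertools
import math
import numpy as np

n = 3
rng = np.random.default_rng(3)
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

S_n = list(itertools.permutations(range(n)))
X = list(itertools.product(range(n), repeat=n))  # all functions, not just bijections

def sign(sigma):
    """Permutation parity via inversion count."""
    inv = sum(1 for a, b in itertools.combinations(range(n), 2)
              if sigma[a] > sigma[b])
    return -1 if inv % 2 else 1

# |AB| as the double sum over psi in X and sigma in S_n.
double_sum = sum(sign(sigma) * math.prod(A[j, psi[j]] * B[psi[j], sigma[j]]
                                         for j in range(n))
                 for psi in X for sigma in S_n)
assert np.isclose(double_sum, np.linalg.det(A @ B))
```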