How can I prove that if $A$ is a singular square matrix, then $\det A = 0$ without using Binet's Theorem?


Well, I need to prove that if $A$ is an $n \times n$ singular matrix, then $\det A = 0$, in order to prove Binet's Theorem $\det(AB) = \det(A)\cdot\det(B)$ in the case when both $A$ and $B$ are singular square matrices.

I'd appreciate any help with this!


BEST ANSWER

Consider $B=A \cdot C^T$ where $C$ is the cofactor matrix of $A$.

Then $b_{i,j}=\sum_k a_{i,k} \, c_{j,k}$, where $c_{j,k}$ is the $(j,k)$ cofactor of $A$ (the signed $(j,k)$ minor).

If $i = j$, this sum is exactly the cofactor expansion of the determinant of $A$ along row $i$, hence $b_{i,i}=\det A$.

If $i\ne j$, it is the cofactor expansion along row $j$ of a matrix that is equal to $A$ except that row $j$ has been overwritten by the contents of row $i$. But the determinant of a matrix with two equal rows is zero, hence $b_{i,j}=0$.

This can be written $$A \cdot C^T = \det(A) \, I \tag{1}$$

If $\det A\ne 0$ then we can write $$ A \, \frac{C^T}{\det A}=I \iff A^{-1}=\frac{C^T}{\det A}=\frac{\operatorname{adj}(A)}{\det A} \tag{2}$$

(This essentially amounts to rediscovering/proving Cramer's rule.)

That is, $\det A\ne 0 \implies A^{-1}$ exists (hence, $A$ is non-singular).

Consequently, taking the contrapositive: if $A$ is singular (i.e. the inverse does not exist), then we must have $\det A=0$.
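To see identity (1) in action, here is a quick numerical sketch in Python/NumPy. The `cofactor_matrix` helper is my own illustration (not a library function): it computes each cofactor from the corresponding minor by brute force.

```python
import numpy as np

def cofactor_matrix(A):
    """Cofactor matrix C of A: C[j, k] = (-1)^(j+k) * det(minor of A at (j, k))."""
    n = A.shape[0]
    C = np.empty_like(A, dtype=float)
    for j in range(n):
        for k in range(n):
            # delete row j and column k to form the (j, k) minor
            minor = np.delete(np.delete(A, j, axis=0), k, axis=1)
            C[j, k] = (-1) ** (j + k) * np.linalg.det(minor)
    return C

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
C = cofactor_matrix(A)
# Identity (1): A @ C.T should equal det(A) * I
print(np.allclose(A @ C.T, np.linalg.det(A) * np.eye(3)))  # True
```

The same check works when $A$ is singular: both sides of (1) then become the zero matrix.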

ANSWER

By definition, $\det$ is an alternating multilinear form in the matrix columns. By assumption, there exists a non-zero $v$ with $Av=0$, i.e., the column vectors are linearly dependent. Use this and alternating multilinearity to show that $\det A$ equals $\det B$ for a matrix $B$ with one column equal to $0$; then linearity in that column gives $\det B=0$.
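A small numerical illustration of this argument, using an assumed example where the third column equals (first column) $+\,2\cdot$(second column): subtracting that combination from the third column is a determinant-preserving column operation and produces a zero column.

```python
import numpy as np

# Columns of A are linearly dependent: col 2 = col 0 + 2 * col 1 (assumed example).
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 2.0],
              [1.0, 1.0, 3.0]])
B = A.copy()
B[:, 2] -= A[:, 0] + 2 * A[:, 1]  # column operation: det unchanged, column becomes 0
print(np.allclose(B[:, 2], 0.0))                        # True
print(np.isclose(np.linalg.det(A), np.linalg.det(B)))   # True (both are 0)
```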

ANSWER

There are two basic operations you can do with matrices and their determinants: first, if any pair of columns (or rows) of a matrix is interchanged, the determinant is multiplied by $-1$; second, if a scalar multiple of one column is added to another column, the value of the determinant is not altered. In this way, you can perform Gaussian elimination on the matrix while keeping track of its determinant. If the matrix is singular, you end up with a zero column (or row), giving determinant $0$.
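A sketch of this procedure in Python/NumPy, using only the two operations above: row swaps (which flip a tracked sign) and row additions (which leave the determinant unchanged). The function and variable names are my own illustration.

```python
import numpy as np

def det_by_elimination(A, tol=1e-12):
    """Determinant via Gaussian elimination, tracking sign flips from row swaps;
    row-addition operations leave the determinant unchanged."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    sign = 1.0
    for col in range(n):
        # partial pivoting: bring the largest pivot up; each swap flips the sign
        pivot = col + np.argmax(np.abs(A[col:, col]))
        if abs(A[pivot, col]) < tol:
            return 0.0  # no usable pivot in this column: singular matrix
        if pivot != col:
            A[[col, pivot]] = A[[pivot, col]]
            sign = -sign
        # subtract multiples of the pivot row: determinant unchanged
        for row in range(col + 1, n):
            A[row] -= (A[row, col] / A[col, col]) * A[col]
    # triangular matrix: determinant is the product of the diagonal
    return sign * np.prod(np.diag(A))

singular = np.array([[1.0, 2.0],
                     [2.0, 4.0]])
print(det_by_elimination(singular))  # 0.0
```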

ANSWER

The inverse exists exactly when the determinant is nonzero, and there is a formula for it: $$A^{-1}=\frac1{\det A}\operatorname{adj}A,$$ where $\operatorname{adj}A$ is the transpose of the matrix of cofactors, the so-called adjugate of $A$.
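As a sanity check, here is the formula in the $2\times 2$ case, where the adjugate can be written down by hand (a Python/NumPy sketch with an arbitrarily chosen invertible matrix):

```python
import numpy as np

# For a 2x2 matrix the adjugate is explicit:
# adj([[a, b], [c, d]]) = [[d, -b], [-c, a]]
A = np.array([[3.0, 1.0],
              [2.0, 4.0]])
adj = np.array([[ A[1, 1], -A[0, 1]],
                [-A[1, 0],  A[0, 0]]])
inv = adj / np.linalg.det(A)
print(np.allclose(inv, np.linalg.inv(A)))  # True
```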

ANSWER

I would use two facts:

  1. Swapping two rows changes the sign of the determinant.
  2. Multiplying a row by a constant multiplies the determinant by that constant.

Then, if a singular matrix $S$ has two linearly dependent (i.e. proportional) rows $r_i$ and $r_j$, you can multiply $r_i$ by a nonzero constant $C$ to get a new matrix $S'$ in which $r_i = r_j$.

We know \begin{equation} \det(S') = C \det(S). \end{equation} Now swap $r_i$ and $r_j$: since these rows are equal, the matrix is unchanged, yet the swap flips the sign of the determinant, so \begin{equation} \det(S') = -\det(S'). \end{equation} Hence $\det(S') = 0$, and since $C \ne 0$, the first equation gives $\det(S) = 0$.
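A tiny numerical illustration of this argument (an assumed example with $r_1 = 3\,r_0$, so $C = 3$):

```python
import numpy as np

# S has proportional rows: row 1 = 3 * row 0 (assumed example).
S = np.array([[1.0, 2.0],
              [3.0, 6.0]])
C = 3.0
S_prime = S.copy()
S_prime[0] *= C            # scale row 0 by C so the two rows become equal
swapped = S_prime[[1, 0]]  # swapping the equal rows leaves the matrix unchanged
print(np.array_equal(S_prime, swapped))   # True
print(np.isclose(np.linalg.det(S), 0.0))  # True
```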