How to proceed: Show that $\det A=0$ if two rows or columns are identical


$A$ is an $n \times n$ matrix. I want to show that $\det A=0$ if two of its rows or columns are identical, without invoking linear dependence or the like.

I was looking at a $3\times 3$ matrix and noticed that if the second and third columns were identical, then each term in the sum had an additive inverse, and therefore $\det A=0$. This is all according to the definition $$\det A=\sum_{b\in S_n}(-1)^{P(b)}\,a_{1,b(1)}a_{2,b(2)}\cdots a_{n,b(n)},$$ where $P(b)$ is the number of transpositions needed to construct $b$.

So for an $n\times n$ matrix we would have a sum with $n!$ terms, and (when columns $n-1$ and $n$ are identical) each pair of permutations differing by the transposition of the last two positions, e.g. $b_1=(1,2,3,\dots,n-1,n)$ and $b_2=(1,2,3,\dots,n,n-1)$, would contribute equal terms with opposite signs, since the numbers of transpositions $P$ needed to construct them differ by one. Is it possible to construct the proof using this idea? What is the problem?
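The cancellation idea above can be checked numerically. Below is a minimal sketch of the definition $\det A=\sum_{b\in S_n}(-1)^{P(b)}a_{1,b(1)}\cdots a_{n,b(n)}$; the matrix values are arbitrary, chosen only so that two columns coincide.

```python
from itertools import permutations
from math import prod

def parity(perm):
    """Sign (-1)^P: -1 if the permutation has an odd number of inversions."""
    inversions = sum(1 for a in range(len(perm))
                       for b in range(a + 1, len(perm))
                       if perm[a] > perm[b])
    return -1 if inversions % 2 else 1

def det_leibniz(A):
    """det A = sum over permutations b of sgn(b) * a_{1,b(1)} * ... * a_{n,b(n)}."""
    n = len(A)
    return sum(parity(b) * prod(A[r][b[r]] for r in range(n))
               for b in permutations(range(n)))

# Arbitrary example: the second and third columns are identical.
A = [[1, 5, 5],
     [2, 7, 7],
     [3, 4, 4]]
print(det_leibniz(A))  # 0 -- every term is cancelled by its paired term
```

All $3!=6$ terms pair off with opposite signs exactly as described, so the sum is $0$.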

2 Answers

BEST ANSWER

Let $n\ge2$, and suppose columns $i$ and $j$ are the same for some $1\le i<j\le n$.

Note that $S_n$ splits into the two left cosets of $A_n$, the group of even permutations: $A_n$ itself and $(ij)A_n$.

Note also, that for any $\sigma\in A_n$, if $\pi=(ij)\sigma$, then we can show that

$$a_{1,\pi_1}\cdot a_{2,\pi_2}\cdot\ldots\cdot a_{n,\pi_n}=a_{1,\sigma_1}\cdot a_{2,\sigma_2}\cdot\ldots\cdot a_{n,\sigma_n}.$$

The idea is that $\pi$ behaves just like $\sigma$ except that if $\sigma$ sends $k\mapsto i$, then $\pi$ sends $k\mapsto j$, and vice versa. But in each case the above products are the same, since columns $i$ and $j$ are the same.
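The claim that the paired products agree can be verified exhaustively for a small case. The $4\times 4$ matrix below is a hypothetical example with columns $i=1$ and $j=2$ (0-indexed) identical; for every even $\sigma$ we build $\pi=(ij)\sigma$ and compare the two products.

```python
from itertools import permutations
from math import prod

def parity(perm):
    """Sign of a permutation via its inversion count."""
    inversions = sum(1 for a in range(len(perm))
                       for b in range(a + 1, len(perm))
                       if perm[a] > perm[b])
    return -1 if inversions % 2 else 1

# Hypothetical example: columns i=1 and j=2 (0-indexed) are identical.
A = [[1, 7, 7, 2],
     [3, 5, 5, 4],
     [6, 9, 9, 8],
     [2, 1, 1, 0]]
i, j, n = 1, 2, 4

for sigma in permutations(range(n)):
    if parity(sigma) == 1:                                # even permutations only
        pi = tuple({i: j, j: i}.get(s, s) for s in sigma)  # pi = (i j) composed with sigma
        assert parity(pi) == -1                            # composing with (i j) flips the sign
        # the products agree because column i equals column j
        assert (prod(A[r][sigma[r]] for r in range(n))
                == prod(A[r][pi[r]] for r in range(n)))
```

Each row $r$ with $\sigma(r)\notin\{i,j\}$ picks the same entry under $\sigma$ and $\pi$, and rows sent to $i$ or $j$ pick equal entries since the columns coincide, which is exactly the argument in the answer.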

Hence we have that

$$\begin{align*} \det(A) &= \sum_{\sigma\in S_n}\operatorname{sgn}(\sigma)\,a_{1,\sigma_1}\cdot a_{2,\sigma_2}\cdot\ldots\cdot a_{n,\sigma_n} \\ &= \sum_{\sigma\in A_n}a_{1,\sigma_1}\cdot a_{2,\sigma_2}\cdot\ldots\cdot a_{n,\sigma_n}+\underset{\sigma\in A_n}{\sum_{\pi=(ij)\sigma}}(-1)\cdot a_{1,\pi_1}\cdot a_{2,\pi_2}\cdot\ldots\cdot a_{n,\pi_n} \\ &= \sum_{\sigma\in A_n}\left(a_{1,\sigma_1}\cdot a_{2,\sigma_2}\cdot\ldots\cdot a_{n,\sigma_n}-a_{1,\sigma_1}\cdot a_{2,\sigma_2}\cdot\ldots\cdot a_{n,\sigma_n}\right)=0.\end{align*}$$

A similar proof for the case where two rows are the same can be recovered from this one. Alternatively, you could use the above definition of the determinant to show that $\det(A^T)=\det(A)$.
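Both routes can be sanity-checked numerically with the same permutation-sum definition; the matrices below are arbitrary examples, one generic (to check $\det(A^T)=\det(A)$) and one with two identical rows.

```python
from itertools import permutations
from math import prod

def parity(perm):
    """Sign of a permutation via its inversion count."""
    inversions = sum(1 for a in range(len(perm))
                       for b in range(a + 1, len(perm))
                       if perm[a] > perm[b])
    return -1 if inversions % 2 else 1

def det_leibniz(A):
    """Permutation-sum (Leibniz) definition of the determinant."""
    n = len(A)
    return sum(parity(b) * prod(A[r][b[r]] for r in range(n))
               for b in permutations(range(n)))

A = [[2, 0, 1],
     [1, 3, 5],
     [4, 2, 6]]
At = [list(row) for row in zip(*A)]   # transpose of A
print(det_leibniz(A), det_leibniz(At))  # equal: det(A^T) = det(A)

B = [[1, 2, 3],
     [1, 2, 3],   # rows 1 and 2 identical
     [4, 5, 6]]
print(det_leibniz(B))  # 0
```

Since $\det(A^T)=\det(A)$, the identical-columns argument immediately covers identical rows as well.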

ANSWER

This result is in the very definition of the determinant (relative to a basis) for a vector space $V $ of dimension $n$ with basis $\mathcal B=(e_1,e_2,\dots, e_n)$ over a field $K$: it is the $n$-linear alternating form on $V^n$, $f(v_1,v_2,\dots,v_n)$, which takes the value $1$ on the basis.

It can be shown that all alternating $n$-linear forms on $V^n$ are scalar multiples of this form.

Alternating means precisely that if any two vectors $v_i$ and $v_j$ are equal, the form takes the value $0$. What you specify as the definition of the determinant is just a consequence of being alternating and $n$-linear, thereby giving a way to calculate the determinant explicitly.
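That consequence can be sketched in one expansion: writing each $v_k=\sum_{i=1}^n a_{i,k}e_i$ and using $n$-linearity, the terms where two basis vectors repeat vanish because $f$ is alternating, so only injective index choices, i.e. permutations, survive:

$$f(v_1,\dots,v_n)=\sum_{i_1,\dots,i_n}a_{i_1,1}\cdots a_{i_n,n}\,f(e_{i_1},\dots,e_{i_n})=\sum_{\sigma\in S_n}\operatorname{sgn}(\sigma)\,a_{\sigma(1),1}\cdots a_{\sigma(n),n},$$

since $f(e_{\sigma(1)},\dots,e_{\sigma(n)})=\operatorname{sgn}(\sigma)\,f(e_1,\dots,e_n)=\operatorname{sgn}(\sigma)$, each swap of two equal-argument positions flipping the sign.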