Prove that in $\mathbb{R}^2$ it holds: {$(a,c), (b,d)$} is linearly independent $\iff ad - bc \neq 0$




Forward direction ($\implies$) :

Linear independence means that the only coefficients $x$ and $y$ satisfying $$x(a,c) + y(b,d) = (0,0)$$ are $x = y = 0$. This can be rewritten as $$(xa + yb, xc + yd) = (0,0)$$

If $x$ and $y$ were not both zero (so the vectors were not linearly independent), then the first equation (assuming $b \neq 0$) would give $$y = - \frac{a}{b}x$$

Substituting into the second equation gives: $$x\left(c - \frac{a}{b}d\right) = 0$$

Assuming $x$ is non-zero, this forces $$c - \frac{a}{b}d = 0 \iff ad - bc = 0$$

First, is this correct so far? Second, how do I prove the other direction? That is, how do I prove that $ad - bc \neq 0$ implies the vectors are linearly independent?
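As a quick numerical sanity check of the claimed equivalence (a minimal sketch, assuming plain Python with no third-party libraries; the helper names `is_dependent_by_scaling` and `det2` are invented for illustration): dependence is tested directly by checking whether one vector is a scalar multiple of the other, and compared against the condition $ad - bc = 0$.

```python
def is_dependent_by_scaling(v, w, tol=1e-12):
    """True if one of the two plane vectors is (approximately) a scalar
    multiple of the other, which is exactly linear dependence in R^2."""
    # the zero vector is dependent with anything
    if max(map(abs, v)) < tol or max(map(abs, w)) < tol:
        return True
    # try w = k*v, using the larger component of v to avoid dividing by ~0
    i = 0 if abs(v[0]) >= abs(v[1]) else 1
    k = w[i] / v[i]
    return abs(w[0] - k * v[0]) < tol and abs(w[1] - k * v[1]) < tol

def det2(a, b, c, d):
    """Determinant ad - bc of the matrix with columns (a,c) and (b,d)."""
    return a * d - b * c

# sample quadruples (a, b, c, d), mixing dependent and independent pairs
samples = [(1, 2, 3, 4), (1, 2, 2, 4), (0, 0, 5, 7), (3, 0, 0, 3)]
for a, b, c, d in samples:
    v, w = (a, c), (b, d)
    dep = is_dependent_by_scaling(v, w)
    print(v, w, "dependent:", dep, "ad-bc:", det2(a, b, c, d))
    # the theorem: dependent exactly when ad - bc = 0
    assert dep == (abs(det2(a, b, c, d)) < 1e-12)
```

Each sample confirms that the scalar-multiple test and the determinant test agree.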

There are 4 answers below.

BEST ANSWER

$(\Rightarrow)$ We shall use proof by contraposition. Assume $ad-bc=0$. We prove that $\{(a,c),(b,d)\}$ is linearly dependent. Note that $-d(a,c)+c(b,d)=(-da,-dc)+(cb,cd)=(cb-da,0)=(0,0)$, because $ad-bc=0$. If $c$ and $d$ are not both zero, this is a non-trivial vanishing combination, so $\{(a,c),(b,d)\}$ is linearly dependent. If $c=d=0$, then $(a,c)=(a,0)$ and $(b,d)=(b,0)$ both lie on the $x$-axis and are again dependent: $b(a,0)-a(b,0)=(0,0)$ is non-trivial unless $a=b=0$, in which case both vectors are zero.

$(\Leftarrow)$ If $\alpha(a,c)+\beta(b,d)=0$, then \begin{cases} \alpha a+\beta b=0 \ \ \ \ \ \text{(1)} \\ \alpha c+\beta d=0 \ \ \ \ \ \text{(2)} \end{cases}

Multiplying the first equation by $c$ and the second one by $a$, we have \begin{cases} \alpha ac+\beta bc=0 \\ \alpha ac+\beta ad=0 \end{cases}

Subtracting the equations, we have $\beta(bc-ad)=0\Rightarrow\beta=0$, because $ad-bc\neq0$.

On the other hand, multiplying the Eq. (1) by $d$ and Eq. (2) by $b$, we have \begin{cases} \alpha ad+\beta bd=0 \\ \alpha bc+\beta bd=0 \end{cases}

Subtracting the equations, we have $\alpha(ad-bc)=0\Rightarrow\alpha=0$, because $ad-bc\neq0$.

Therefore, if $\alpha(a,c)+\beta(b,d)=0$ and $ad-bc\neq0$, then $\alpha=\beta=0$. Hence $\{(a,c),(b,d)\}$ is linearly independent.
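The two elimination identities used above, $c\cdot(1)-a\cdot(2)=\beta(bc-ad)$ and $d\cdot(1)-b\cdot(2)=\alpha(ad-bc)$, can be spot-checked numerically. A small random-testing sketch (an illustrative addition, plain Python standard library only):

```python
import random

random.seed(0)
for _ in range(1000):
    a, b, c, d, alpha, beta = (random.uniform(-10, 10) for _ in range(6))
    eq1 = alpha * a + beta * b   # left-hand side of equation (1)
    eq2 = alpha * c + beta * d   # left-hand side of equation (2)
    # c*(1) - a*(2) eliminates alpha, leaving beta*(bc - ad)
    assert abs((c * eq1 - a * eq2) - beta * (b * c - a * d)) < 1e-8
    # d*(1) - b*(2) eliminates beta, leaving alpha*(ad - bc)
    assert abs((d * eq1 - b * eq2) - alpha * (a * d - b * c)) < 1e-8
print("elimination identities hold on all random samples")
```

Both identities hold for every random choice of coefficients, which is what lets the answer conclude $\alpha=\beta=0$ whenever $ad-bc\neq0$.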

ANSWER

Simple proof: From linear algebra, you may know that the set $\left\{(a,c),(b,d)\right\}$ is linearly independent if and only if the determinant of the matrix $$\begin{pmatrix} a & b \\ c & d \end{pmatrix}$$ is non-zero.
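For concreteness, this determinant criterion is easy to implement; a minimal sketch in Python (the function name `is_linearly_independent` is made up here):

```python
def is_linearly_independent(v, w, tol=1e-12):
    """Test two vectors in R^2 via the determinant of the matrix whose
    columns are v and w: they are independent iff a*d - b*c != 0."""
    (a, c), (b, d) = v, w        # v = (a, c), w = (b, d)
    return abs(a * d - b * c) > tol

print(is_linearly_independent((1, 3), (2, 4)))  # det = 1*4 - 2*3 = -2, so True
print(is_linearly_independent((1, 2), (2, 4)))  # det = 1*4 - 2*2 = 0, so False
```

The `tol` threshold is only needed for floating-point inputs; for exact integer entries one can compare `a*d - b*c` with zero directly.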

ANSWER

Please use this idea. $\Rightarrow$ Assume $\{(a,c), (b,d)\}$ is linearly independent and, seeking a contradiction, assume $ad-bc = 0$. If $a$ and $c$ are non-zero, then $ad-bc=0$ gives $\frac{d}{c}=\frac{b}{a}$, so $(b,d)=\frac{b}{a}(a,c)$ (check both components, using $ad=bc$), which contradicts linear independence. We have to be careful and modify the argument if $a=0$ (or $c=0$, and so on), but it would be similar.

Use this idea for $\Leftarrow$.

Assume $ad-bc \neq 0$. Assume $\{(a,c),(b,d)\}$ is linearly dependent (seeking a contradiction). Then, for some $\omega$ (swapping the roles of the two vectors if necessary), $(a,c)=(\omega b, \omega d)$. Hence $ad=\omega bd$ (where $a$ has been replaced by $\omega b$) and $bc=\omega bd$ (where $c$ has been replaced by $\omega d$). Subtracting, $ad-bc=\omega bd-\omega bd=0$. But we initially assumed $ad-bc$ is non-zero, a contradiction.
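The subtraction step can be confirmed numerically; a small illustrative sketch (plain Python, names chosen here for illustration) builds a dependent pair $(a,c) = \omega(b,d)$ and checks that $ad - bc$ vanishes:

```python
import random

random.seed(1)
b, d, omega = (random.uniform(-5, 5) for _ in range(3))
a, c = omega * b, omega * d       # (a, c) = omega * (b, d): a dependent pair
# ad = (omega*b)*d and bc = b*(omega*d), so ad - bc = 0 up to rounding
print("ad - bc =", a * d - b * c)
assert abs(a * d - b * c) < 1e-9
```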

In this type of proof, arguing by contradiction is usually a nice idea.

ANSWER

For the $\Rightarrow$ side, it is correct.
For the $\Leftarrow$ side, maybe you can try contradiction: assume $(a,c)$ and $(b,d)$ are linearly dependent and show that in that situation $ad-bc = 0$.