Cramer's rule and linear dependence/independence test


Suppose we have the system of equations: $$\begin{cases} ax + by = e\\ cx + dy = f\end{cases}$$ If we do some row operations to eliminate $y$, we get:

$$x = \frac{ed-bf}{ad-bc}\tag{1}$$

If we eliminate $x$ we get:

$$y = \frac{af-ec}{ad-bc}$$ This will always work unless $ad-bc=0$, which would result in division by zero. So the system will always have a solution if $ad-bc\neq0$, which is defined as the determinant of the matrix:

$$ \begin{bmatrix} a & b\\ c & d \\ \end{bmatrix}$$ The first question is: if the determinant is non-zero, shouldn't the system always have a unique solution? After all, the row operations are all valid unless $ad-bc=0$.
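As a sanity check, the $2 \times 2$ formulas can be verified numerically. Here is a minimal sketch using NumPy (the coefficients are an arbitrary example, not from the question):

```python
import numpy as np

# Coefficients of: a*x + b*y = e,  c*x + d*y = f
a, b, e = 2.0, 3.0, 8.0
c, d, f = 1.0, -1.0, -1.0

det = a * d - b * c          # determinant of [[a, b], [c, d]]
assert det != 0, "Cramer's rule needs a non-zero determinant"

# Cramer's rule for the 2x2 system
x = (e * d - b * f) / det
y = (a * f - e * c) / det

# Cross-check against NumPy's general solver
expected = np.linalg.solve(np.array([[a, b], [c, d]]),
                           np.array([e, f]))
print(x, y)                  # matches expected: x = 1.0, y = 2.0
```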

The same can be done for $3 \mbox{ by } 3$ matrices, and we'll get a different determinant and a different solution for $x, y \mbox{ and } z$.

So, can the same be said for the $3$ by $3$ case? The system will always have a unique solution if the determinant is non-zero?

I'm asking this because my book gives me the determinant linear independence/dependence test, but does not prove it. So I'm trying to prove it:

$(\vec u, \vec v, \vec w)$ L.I. $\iff$ $(\alpha \vec u + \beta\vec v + \gamma\vec w = \vec 0 \implies \alpha = \beta = \gamma = 0) $

or

$(\vec u, \vec v, \vec w)$ L.I. $\iff$ $(\alpha (x_1, y_1, z_1)+ \beta(x_2, y_2, z_2) + \gamma(x_3,y_3,z_3) = \vec 0 \implies \alpha = \beta = \gamma = 0) $

Which is the same as saying that the system:

$$\begin{cases} \alpha x_1 + \beta x_2 + \gamma x_3 = 0\\ \alpha y_1 + \beta y_2 + \gamma y_3 = 0\\ \alpha z_1 + \beta z_2 + \gamma z_3 = 0\end {cases}$$ has the unique solution $ \alpha = \beta = \gamma = 0$.

But there are cases where the determinant of the system is $0$, yet the system still has solutions. So, back to $(1)$: why are there cases where the determinant is $0$ but the system has solutions? Shouldn't the row operations (which yield the general solution $x = \frac{ed-bf}{ad-bc}$) solve all systems? Where is the inconsistency?
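To see such a case concretely, here is a small sketch (the three vectors are an assumed example, chosen so that $\vec w = \vec u + \vec v$):

```python
import numpy as np

# Rows are the vectors u, v, w; they are L.I. iff the determinant is non-zero
u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])
w = np.array([1.0, 1.0, 0.0])   # w = u + v, so the set is dependent

M = np.vstack([u, v, w])
det = np.linalg.det(M)
print(det)                       # 0.0: the homogeneous system is still solvable

# The system has solutions beyond the trivial one, e.g. alpha = beta = 1, gamma = -1:
print(1 * u + 1 * v - 1 * w)     # [0. 0. 0.]
```

So a zero determinant never makes the homogeneous system unsolvable; it only means $\alpha = \beta = \gamma = 0$ is no longer the *unique* solution.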


There are 2 answers below.


Firstly, if your determinant is non-zero, this means that the matrix is invertible. Hence

$$Ax = b \implies x = A^{-1}b$$

So we have a unique solution.

If you are in the case where $b=0$, then you always have a solution regardless of what $A$ is, namely $x=0$. So for checking linear dependence, we want this to be the only solution, which boils down to checking that the determinant is non-zero.
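Both claims can be illustrated numerically. A small sketch, using an arbitrary invertible example matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])      # det = 2*3 - 1*1 = 5, so A is invertible

# Non-zero determinant: Ax = b has the unique solution x = A^{-1} b
b = np.array([3.0, 4.0])
x = np.linalg.solve(A, b)
print(np.allclose(A @ x, b))    # True

# Homogeneous case b = 0: x = 0 is always a solution, and it is the
# only one precisely because det(A) != 0
print(np.linalg.solve(A, np.zeros(2)))   # [0. 0.]
```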


If $(ad-bc)$ is zero, the two lines have the same slope. If their intercepts also differ, the lines are parallel and distinct, so the equations are inconsistent and the system has no solution. In this case (assuming $b, d \neq 0$) let

$$ \frac{a}{b}= \frac{c}{d} = m$$

The two lines have the same slope, so they are parallel, but have different $y$-intercepts, as in:

$$ y=\frac{e}{b}-mx$$

and

$$ y=\frac{f}{d}-mx$$


If one row (or column) of the LHS matrix is another row (or column) multiplied by some factor, the rows are linearly dependent, resulting in a zero determinant.

When the determinant is non-zero, there is always a unique solution: the point of intersection of the two lines.
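The parallel-line case can be reproduced numerically. A sketch with example coefficients chosen so that $ad-bc = 0$ while the intercepts differ:

```python
import numpy as np

# Two parallel lines with different intercepts:
#   x + 2y = 1  and  2x + 4y = 5
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
b = np.array([1.0, 5.0])

det = np.linalg.det(A)           # ad - bc = 1*4 - 2*2 = 0
print(det)

try:
    np.linalg.solve(A, b)        # NumPy refuses: the matrix is singular
    has_unique_solution = True
except np.linalg.LinAlgError:
    has_unique_solution = False

print(has_unique_solution)       # False: parallel lines never meet
```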