Do linearly dependent matrix columns imply zeroes in the solution of the system of equations given by this matrix?


Let's assume that we have a system of equations:

$a_{11} x_1 + ... + a_{1n} x_n = 0$
$a_{21} x_1 + ... + a_{2n} x_n = 0$
$\vdots$
$a_{m1} x_1 + ... + a_{mn} x_n = 0$

Let us denote by $C=\{C_{i_1}, C_{i_2}, ..., C_{i_w}\}$ a maximal linearly independent set of columns of the matrix $M$ of the above equation system.
Let us denote $I:=\{{i_1}, {i_2}, ...., {i_w}\}$.
Is there a theorem saying that there exists a solution to the above system in which the $x_i$ with $i \notin I$ are all equal to zero? If so, how is it proved, and is it merely "there exists", or does it in fact hold for every solution of the system?
I'm asking because I saw this fact apparently being used, without justification, in a proof of the theorem that the column rank of every matrix equals its row rank.

Remark: the proof in question actually assumes that the rows of the matrix $M$ are linearly independent, but I don't think that is relevant to the question.

Best answer

Since the columns $C_{i_j}$ with $i_j \in I$ form a linearly independent set of vectors, there is exactly one solution to the system in which all the other $x_i \,(i\not\in I)$ are zero, namely the trivial solution (with $x_i = 0$ for all $i$): setting $x_i = 0$ for $i \notin I$ reduces the system to $x_{i_1} C_{i_1} + \dots + x_{i_w} C_{i_w} = 0$, and linear independence forces every $x_{i_j}$ to be zero.

It is not really a theorem; it is a straightforward consequence of the definition of a linearly independent set.
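The claim can be checked numerically on a small example. Below is a minimal sketch using NumPy with a hypothetical $2 \times 3$ matrix whose third column is the sum of the first two, so $I = \{1, 2\}$ indexes a maximal independent set of columns; the matrix and indices are chosen purely for illustration.

```python
import numpy as np

# Hypothetical example: column 3 = column 1 + column 2,
# so I = {1, 2} indexes a maximal linearly independent set of columns.
M = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

# The columns are dependent, so a nontrivial solution of M x = 0 exists:
x = np.array([1.0, 1.0, -1.0])
assert np.allclose(M @ x, 0)

# Forcing x_i = 0 for i not in I restricts the system to the columns in I:
M_I = M[:, [0, 1]]

# Those columns are linearly independent (full column rank), so the
# restricted homogeneous system M_I y = 0 has only the trivial solution:
assert np.linalg.matrix_rank(M_I) == M_I.shape[1]
```

Note that the nontrivial solution $x = (1, 1, -1)$ has $x_3 \neq 0$, illustrating that the "all $x_i$ with $i \notin I$ are zero" property holds only for the trivial solution, not for every solution.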

[Note: the fact that the rows of $M$ are linearly independent tells you that $w$ is in fact equal to $m$; but indeed, that's not particularly relevant to your question.]