Why is dimension of solution space of homogeneous equations n-r?


This throws me off track completely; it's like being pushed out of a moving train. I am referring to page 65 of Shilov's Linear Algebra.

The author clearly states that for a homogeneous system of linear equations:

  1. if the coefficient matrix has order $k \times n$ ($k$ = number of equations, $n$ = number of unknowns), and
  2. $r$ is the rank of the matrix,

then the solution space has dimension $n - r$.

I would think that if the rank is $r$, then the number of linearly independent rows is $r$. So any solution $x_i$ of the $(r+1)$st through $k$th equations should be totally describable by $r$ linearly independent solutions $r_i$, $i = 1, \dots, r$ (again going by the rank).

Yet the dimension of the solution space $L$ is given by $n - r$.

Why?

Update: What's being said is starting to make sense. I am probably mixing up the concept of linear independence of the columns of the basis-minor matrix (rank $r$) with the dimension of the solution space.

I guess the fact that the first $r$ columns of a coefficient matrix of rank $r$ are linearly independent implies that the other columns $r+1, \dots, n$ are expressible in terms of those $r$ columns.

That gives us the freedom to arbitrarily choose values $c_{r+1}, \dots, c_n$ for the dependent-column variables, with the values $c_1, \dots, c_r$ for the linearly independent columns then uniquely determined (by Cramer's rule). This "freedom" manifests as the dimension $n - r$ of the solution space. Does the above sound coherent?
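To sanity-check this, here is a minimal Python sketch (the specific $3 \times 4$ matrix is my own illustration, assuming numpy as tooling, not something from Shilov): with $k = 3$ equations, $n = 4$ unknowns, and rank $r = 2$, the solution space of $Ax = 0$ should have dimension $n - r = 2$.

```python
import numpy as np

# A 3x4 homogeneous system A x = 0: k = 3 equations, n = 4 unknowns.
# Row 3 = row 1 + row 2, so the rank is 2, not 3.
A = np.array([
    [1.0, 2.0, 3.0, 4.0],
    [0.0, 1.0, 1.0, 1.0],
    [1.0, 3.0, 4.0, 5.0],
])

k, n = A.shape
r = np.linalg.matrix_rank(A)

# SVD gives an orthonormal basis of the null space: the right singular
# vectors whose singular values are (numerically) zero.
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[r:]                      # shape (n - r, n)

print(r)                                 # rank of the coefficient matrix
print(n - r)                             # predicted dimension of solution space
print(null_basis.shape[0])               # actual nullity found via the SVD
print(np.allclose(A @ null_basis.T, 0))  # each basis vector solves A x = 0
```

Both counts agree: two of the four unknowns can be chosen freely, and the other two are then determined.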


There are 2 best solutions below


Have you tried an example, say a single equation (not identically zero) with 100 unknowns? I hope you agree that there are 99 free parameters in the solution in that case, not just one. (All variables can do what they want, except that one of them will have to adapt its value to make sure the equation is satisfied.)
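To make this concrete, here is a quick numerical check (a sketch in Python with numpy; the random coefficients are my own, not from the answer): one nonzero equation in 100 unknowns has rank 1, leaving 99 free parameters.

```python
import numpy as np

# One equation in 100 unknowns: a single 1x100 coefficient row.
n = 100
rng = np.random.default_rng(0)
A = rng.normal(size=(1, n))   # not identically zero

r = np.linalg.matrix_rank(A)  # rank of a nonzero row is 1
print(r)                      # 1
print(n - r)                  # 99 free parameters in the solution
```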


Hans's remark is salient. The dimension of the solution space is, in some sense, the "degrees of freedom" you have in choosing your variables, i.e., the dimension of the subspace in which the solutions lie (for example, $ax+by=0$ has an entire line of solutions, granted that $a, b \neq 0$).

Here is another way to think about it, from a more linear-algebraic point of view. Depending on your background, it may be helpful.

A $k \times n$ matrix can be thought of as a linear transformation $\mathbb R^n \to \mathbb R^k$, in the sense that it takes as input an $n$-dimensional vector (of variables) and outputs a $k$-dimensional vector.

The rank of a matrix is the dimension of its image. In other words, the map $\mathbb R^n \to \mathbb R^k$ may not be surjective; all of the outputs could potentially lie on the same line, as in the case of $$A:= \begin{pmatrix}1 & 0\\0 & 0 \end{pmatrix}$$

So, asking for the solutions to a bunch of equations is asking when $$Ax = 0,$$ where $x \in \mathbb R^n$ and $0 \in \mathbb R^k$. The rank-nullity theorem then tells us that $\dim \ker A = n - \dim \mathrm{Im}\, A = n - r$.
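A minimal sketch verifying rank-nullity on the $2 \times 2$ matrix $A$ above (assuming Python with numpy as tooling):

```python
import numpy as np

# The matrix A from the answer: it collapses everything onto the x-axis.
A = np.array([[1.0, 0.0],
              [0.0, 0.0]])

n = A.shape[1]
r = np.linalg.matrix_rank(A)   # dim Im A = 1 (the image is the x-axis)
nullity = n - r                # rank-nullity: dim ker A = n - r

print(r, nullity)              # both are 1

# The kernel is spanned by (0, 1), since A maps it to the zero vector.
print(np.allclose(A @ np.array([0.0, 1.0]), 0))
```

Here $n = 2$ and $r = 1$, so the solution space of $Ax = 0$ is the one-dimensional line spanned by $(0, 1)$.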