Rank of vectors


Prove that the rank of a system of vectors from $E^n$ is not bigger than the dimension of the vectors. For example, if the vectors $a,b,c$ are from $E^n$, then each of them has $n$ components (e.g. $a=(a_1,a_2,\dots,a_n)$), so the rank $r$ of this system satisfies $r\le n$. I need to prove this. I was thinking that maybe I can prove that the maximum number of linearly independent vectors in $E^n$ is $n$, so the rank cannot be bigger than $n$, but I do not have a theorem for that either.



$\text{Claim: }$ It is not possible to have more than $n$ linearly independent vectors in $E^n$.

$\text{Proof: }$ For a contradiction, suppose we have $m>n$ linearly independent vectors in $E^n$. Let $A$ be the $n \times m$ matrix formed by using these vectors as its columns. Then the rank of $A$ is $m$ by definition. However, $A$ is an $n \times m$ matrix, so its row echelon form can have at most $n$ pivots (one per row). This implies that its rank is at most $n$; that is, $m\le n$, which contradicts $m>n$. We conclude that we cannot have more than $n$ linearly independent vectors in $E^n$.
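The pivot-counting argument can be made concrete with a small Python sketch (pure standard library; the function name `matrix_rank` and the example matrices are my own illustration, not part of the proof). Rank is computed by row reduction, and each pivot occupies a distinct row, so the result can never exceed the number of rows $n$:

```python
from fractions import Fraction

def matrix_rank(rows):
    """Rank via Gaussian elimination with exact arithmetic.
    Each pivot lies in a distinct row and a distinct column,
    so the rank is bounded by min(#rows, #columns)."""
    A = [[Fraction(x) for x in row] for row in rows]
    n_rows = len(A)
    n_cols = len(A[0]) if A else 0
    rank = 0
    col = 0
    while rank < n_rows and col < n_cols:
        # look for a pivot in this column, at or below the current row
        pivot = next((r for r in range(rank, n_rows) if A[r][col] != 0), None)
        if pivot is None:
            col += 1
            continue
        A[rank], A[pivot] = A[pivot], A[rank]
        # eliminate the entries below the pivot
        for r in range(rank + 1, n_rows):
            factor = A[r][col] / A[rank][col]
            for c in range(col, n_cols):
                A[r][c] -= factor * A[rank][c]
        rank += 1
        col += 1
    return rank

# Three vectors of E^2 as the columns of a 2x3 matrix:
# only 2 rows are available, so the rank is capped at n = 2.
A = [[1, 0, 3],
     [0, 1, 4]]
print(matrix_rank(A))  # prints: 2
```

Using exact `Fraction` arithmetic avoids the floating-point tolerance issues a numerical rank computation would need to handle.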

As you stated in your question, the statement you wish to prove follows from the above result. If you have any questions feel free to post them in the comments. Hope this helps!


Not a proof, but a simple example that I hope clarifies the problem. I use vectors in $\mathbb{R}^2$ to keep the calculations simple. Given two nonzero vectors $$\vec a= \left[ \begin {array}{c} a_1 \\ a_2 \end {array} \right] \quad \text{and} \quad \vec b= \left[ \begin {array}{c} b_1 \\ b_2 \end {array} \right],$$ suppose that $a_1 \ne 0$ and $b_1 \ne 0$ (since the vectors are nonzero, at least one component of each must be nonzero).

What does it mean for them to be linearly dependent? That we can find two numbers $x,y$, not both zero, such that $x\vec a+y\vec b=0$, i.e. $$ x \left[ \begin {array}{c} a_1 \\ a_2 \end {array} \right] +y \left[ \begin {array}{c} b_1 \\ b_2 \end {array} \right]= \left[ \begin {array}{c} xa_1+yb_1 \\ xa_2+yb_2 \end {array} \right]=0. $$ This is equivalent to the system: $$ \begin{cases} a_1x+b_1y=0\\ a_2x+b_2y=0 \end{cases} $$ Solving the first equation for $y$, substituting into the second equation, and simplifying, you find: $$ \begin{cases} y=-\dfrac{a_1x}{b_1}\\ x(a_2b_1-b_2a_1)=0 \end{cases} $$ so you see that if $a_2b_1-b_2a_1\ne 0$ you must have $x=y=0$, so:

$$ (a_1b_2-b_1a_2)\ne 0 $$

is the condition for the two vectors to be linearly independent.
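This condition is easy to check numerically. A minimal sketch (the helper name `independent_2d` is assumed for illustration; it just evaluates the expression $a_1b_2-b_1a_2$ above):

```python
def independent_2d(a, b):
    """a and b are (a1, a2) pairs; the two vectors are linearly
    independent exactly when a1*b2 - a2*b1 is nonzero."""
    return a[0] * b[1] - a[1] * b[0] != 0

print(independent_2d((1, 2), (3, 4)))  # prints: True
print(independent_2d((1, 2), (2, 4)))  # prints: False, since (2, 4) = 2*(1, 2)
```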

Now suppose that $\vec a$ and $\vec b$ are linearly independent. I want to show that for any third vector $$ \vec c= \left[ \begin {array}{c} c_1 \\ c_2 \end {array} \right] $$ with $c_1 \ne 0$, the three vectors $\vec a,\vec b,\vec c$ are linearly dependent. For this we have to show that there exist three numbers $x,y,z$, not all zero, such that $x\vec a+y\vec b+z\vec c=0$. This gives the system: $$ \begin{cases} a_1x+b_1y+c_1z=0\\ a_2x+b_2y+c_2z=0 \end{cases} $$ Solving the first equation for $x$, substituting into the second, and solving for $y$, we find: $$ \begin{cases} x=-\dfrac{c_1z+b_1y}{a_1}\\ y=\dfrac{z(a_1c_2-a_2c_1)}{a_2b_1-a_1b_2} \end{cases} $$ If $a_1c_2-a_2c_1 \ne 0$, the vectors $\vec a$ and $\vec c$ are linearly independent, and in this case, choosing $z\ne 0$ gives $y \ne 0$ and a corresponding $x$, so the identity defining linear dependence of the three vectors is satisfied. Otherwise $\vec a$ and $\vec c$ are linearly dependent, and the three vectors are dependent for that reason.
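The two formulas above can be evaluated directly. Here is a sketch (the function name `dependence_coefficients` and the sample vectors are my own; it assumes $a_1\ne 0$ and that $\vec a,\vec b$ are independent, so the denominator $a_2b_1-a_1b_2$ is nonzero) that picks $z=1$ and recovers $y$ and $x$ exactly as in the substitution:

```python
from fractions import Fraction

def dependence_coefficients(a, b, c):
    """Follow the substitution above: fix z = 1, then
    y = z*(a1*c2 - a2*c1) / (a2*b1 - a1*b2)  and
    x = -(c1*z + b1*y) / a1.
    Assumes a1 != 0 and a, b linearly independent."""
    a1, a2 = map(Fraction, a)
    b1, b2 = map(Fraction, b)
    c1, c2 = map(Fraction, c)
    z = Fraction(1)
    y = z * (a1 * c2 - a2 * c1) / (a2 * b1 - a1 * b2)
    x = -(c1 * z + b1 * y) / a1
    return x, y, z

a, b, c = (1, 2), (3, 4), (5, 6)
x, y, z = dependence_coefficients(a, b, c)
# verify that x*a + y*b + z*c = 0 in both components
print(x * a[0] + y * b[0] + z * c[0],
      x * a[1] + y * b[1] + z * c[1])  # prints: 0 0
```

Since $z=1\ne 0$, the coefficients are not all zero, which is exactly the nontrivial combination the argument requires.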

As noted at the beginning, this is not a proof. To extend this reasoning to vectors in $\mathbb{R}^n$ we need the properties of matrices shown in the answer of Gecko, but the meaning of those properties is not so different.