If $n=\dim(V)$ and $n$ vectors are linearly independent, then they form a basis


If $V$ is a vector space and $v_1, v_2, \dots, v_n \in V$ span $V$, and $u_1, u_2, \dots, u_m \in V$ are linearly independent, then $m \le n$.

Use this to prove that if $V$ has dimension $n$ and $u_1, u_2, \dots, u_n \in V$ are linearly independent, then $u_1, u_2, \dots, u_n$ form a basis of $V$.

Do I prove that $V$ has a basis with n elements? Not really sure how to approach this proof.


There are 5 best solutions below


You need to show that $u_1,\dots,u_n$ span $V$. The easiest way to do this is to suppose that there is a vector $v\in V$ which is not in the span of $u_1,\dots,u_n$. Then $u_1,\dots,u_n,v$ is a linearly independent set. Now apply your result on the size of linearly independent sets; this leads to a contradiction, so no such $v$ can exist.
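Written out, the counting step is a one-line sketch (taking a basis of $V$ as the spanning set of size $n$):

```latex
% Suppose v \in V with v \notin \operatorname{span}(u_1, \dots, u_n).
% Then u_1, \dots, u_n, v are n + 1 linearly independent vectors,
% while a basis of V is a spanning set with n elements, so the lemma gives
\[
  n + 1 \le n,
\]
% which is absurd; hence no such v exists and u_1, \dots, u_n span V.
```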


Hint. Assume that $u_1,\ldots,u_n$ DO NOT form a basis of $V$. Then, as they are linearly independent, they DO NOT span $V$. Hence, there exists a vector, say $u_{n+1}$ NOT spanned by the $u_1,\ldots,u_n$. This implies that $u_1,\ldots,u_n,u_{n+1}$ are linearly independent.

This apparently leads to a contradiction...


You can argue by contradiction. In addition, you need the following fact:

Let $V$ be a vector space, and suppose $v_1, \dots, v_n$ are linearly dependent vectors spanning $V$. Then there exists $i$ such that $V$ is spanned by $ v_1, \dots, v_{i-1}, v_{i+1}, \dots v_n $ ($v_i$ disappears).

Now, since $V$ has dimension $n$, there exists a basis of $V$, say $w_1, \dots, w_n$. In particular, $w_1, \dots, w_n$ are linearly independent.

Suppose by contradiction that $u_1, \dots, u_n$ span $V$ but do not form a basis; then they must be linearly dependent. By the fact above, $u_1, \dots, u_{i-1}, u_{i+1}, \dots, u_n$ span $V$ for some $i$, and these are $n-1$ vectors.

Now, applying the proposition you stated, $$\left\{ \begin{matrix} \mbox{$u_1, \dots, u_{i-1}, u_{i+1}, \dots, u_n$ span $V$} \\ \mbox{$w_1, \dots, w_n$ are linearly independent} \end{matrix} \right. \Longrightarrow n \le n-1,$$ a contradiction.


Just for completeness: although proof by contradiction is certainly the simplest way to prove this result, you don't have to use it.

Instead, since you know that you have an $n$ dimensional vector space there must exist a set of $n$ basis vectors. Call them $\mathbf{b_1}, \dots, \mathbf{b_n}$. Since they are a basis, for each of your vectors $\mathbf{u_i}$ there exist coefficients $a_{ij}$ such that $$ \mathbf{u_1} = a_{11} \mathbf{b_1} + a_{12} \mathbf{b_2} + \dots + a_{1n} \mathbf{b_n} \\ \vdots \\ \mathbf{u_i} = a_{i1} \mathbf{b_1} + a_{i2} \mathbf{b_2} + \dots + a_{in} \mathbf{b_n} \\ \vdots \\ \mathbf{u_n} = a_{n1} \mathbf{b_1} + a_{n2} \mathbf{b_2} + \dots + a_{nn} \mathbf{b_n} $$ You then have $n$ linearly independent equations, which means that you can rearrange these to form new equations $$ \mathbf{b_i} = c_{i1} \mathbf{u_1} + c_{i2} \mathbf{u_2} + \dots + c_{in} \mathbf{u_n} $$ (where $c_{ij}$ are coefficients defined in terms of the $a_{ij}$'s). This means that you have found $n$ linearly independent vectors such that each of the basis vectors $\mathbf{b_i}$ can be written as a linear combination of them. It is a simple consequence of the definition of a basis that this is sufficient to show that $\mathbf{u_1}, \dots, \mathbf{u_n}$ span $V$, and are therefore a basis for $V$.

Aside: if this looks like matrix multiplication, that's because it is! This manoeuvre only works when the matrix is invertible, and the coefficients $c_{ij}$ are exactly the entries of the inverse matrix. If you are interested, try to work through the proof as I've written it, but where the $\mathbf{u_i}$ are not linearly independent, and see where it fails. It can also be instructive to pick a familiar vector space and work through this by hand: I recommend proving that $\left\{ \begin{pmatrix}1\\0\end{pmatrix}, \begin{pmatrix}1\\1\end{pmatrix} \right\}$ is a basis for $\mathbb{R}^2$.
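As a quick numerical illustration of the aside (a sketch using NumPy, not part of the proof), the suggested $\mathbb{R}^2$ example can be checked by inverting the matrix whose rows are the candidate basis vectors:

```python
import numpy as np

# Rows are the candidate basis vectors u_1 = (1, 0), u_2 = (1, 1).
U = np.array([[1.0, 0.0],
              [1.0, 1.0]])

# Linear independence <=> the matrix U is invertible (nonzero determinant).
assert np.linalg.det(U) != 0

# The coefficients c_ij from the answer are exactly the entries of U^{-1}.
C = np.linalg.inv(U)

# Each standard basis vector e_i is recovered as sum_j c_ij * u_j.
for i in range(2):
    print(C[i] @ U)  # row i of C times the rows of U gives e_i
```

If the $\mathbf{u_i}$ were dependent, `np.linalg.inv` would raise a `LinAlgError`, which is exactly where the proof fails.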


The “exchange lemma” by Steinitz says something more about your two sets $\{v_1,\dots,v_n\}$ and $\{u_1,\dots,u_m\}$ besides $m\le n$. Precisely, it says that you can replace $m$ elements in the first set with the elements in the second set, so that you still have a spanning set.

In case $m=n$, you replace all the vectors in the given spanning set, so the given linearly independent set is a basis.
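For a concrete instance (my own example, not from the lemma statement), take in $\mathbb{R}^2$ the spanning set $v_1 = (1,0)$, $v_2 = (0,1)$ and the independent set $u_1 = (1,1)$, $u_2 = (1,-1)$:

```latex
\begin{align*}
  \text{Step 1: } & u_1 = v_1 + v_2
    && \Rightarrow \{u_1, v_2\} \text{ spans, since } v_1 = u_1 - v_2. \\
  \text{Step 2: } & u_2 = u_1 - 2 v_2
    && \Rightarrow \{u_1, u_2\} \text{ spans, since } v_2 = \tfrac{1}{2}(u_1 - u_2).
\end{align*}
```

With $m = n = 2$, every $v_i$ has been exchanged, so $\{u_1, u_2\}$ is a basis.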


The information $m\le n$ is not sufficient to conclude without using specific properties of vector spaces. For instance, you can prove that a linearly independent set in a finitely generated free abelian group must have at most as many elements as the rank of the group (that is, the number of elements in a basis), but it's not true that a linearly independent set with as many elements as the rank is a spanning set: in $\mathbb{Z}$, which has rank $1$, the set $\{2\}$ is linearly independent but does not generate $\mathbb{Z}$.

You can use the auxiliary result, for vector spaces, that every linearly independent set can be extended to a basis; since any two bases have the same number of elements, you're done.
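Sketched in symbols, this alternative route (using the two auxiliary results just mentioned) is:

```latex
\[
  u_1, \dots, u_n \text{ independent}
  \;\Rightarrow\; u_1, \dots, u_n, x_1, \dots, x_k \text{ is a basis of } V
  \;\Rightarrow\; n + k = \dim V = n
  \;\Rightarrow\; k = 0,
\]
% so u_1, \dots, u_n was already a basis.
```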