How can one understand conceptually, and prove, that any $n$ linearly independent vectors form a basis of $\mathbb{R}^n$?


I know how to use methods such as systems of linear equations to show that any vector in $\mathbb{R}^n$ can be expressed as a unique linear combination of a given set of $n$ linearly independent vectors, but how do I extend this to the claim that any set of exactly $n$ linearly independent vectors must form a basis of $\mathbb{R}^n$? How can I understand this conceptually?

I saw there are similar questions, but the answers in those posts do not really convince me.

Thanks in advance.

Best answer:

I'm not sure if this answers your question, but for a set of vectors to be a basis of a vector space, it must be linearly independent and must span the entire space. So if you're given a set of $n$ linearly independent vectors in $\mathbb{R}^n$, you only have to show that it spans $\mathbb{R}^n$. But, as you said yourself, this is indeed the case, since you can express any vector as a linear combination of vectors from your set.
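As a minimal numerical sketch of this spanning argument (using NumPy, with three arbitrarily chosen independent vectors in $\mathbb{R}^3$): stacking the vectors as columns gives an invertible matrix, so every target vector has a unique coordinate vector, found by solving one linear system.

```python
import numpy as np

# Three linearly independent vectors in R^3, chosen purely for illustration,
# stacked as the columns of A.
A = np.column_stack([
    [1.0, 0.0, 1.0],
    [2.0, 1.0, 0.0],
    [0.0, 1.0, 1.0],
])

# Linear independence of the columns <=> A is invertible (nonzero determinant).
assert abs(np.linalg.det(A)) > 1e-12

# Any vector b in R^3 then has a unique coordinate vector c with A @ c = b.
b = np.array([3.0, -1.0, 2.0])
c = np.linalg.solve(A, b)
print(np.allclose(A @ c, b))  # the linear combination reproduces b
```

The same check works for any $b$, which is exactly the statement that the three columns span $\mathbb{R}^3$.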

If that isn't satisfactory, you could also think about it this way: Suppose you have $n$ linearly independent vectors $(v_1, \dots , v_n)$ and another vector $w$ not in the span of $(v_1, \dots , v_n)$. Then $(v_1, \dots , v_n, w)$ is linearly dependent, since it is a set of $n+1$ vectors in a vector space of dimension $n$. This means there are $\mu, \lambda_i \in \mathbb{R}, i=1,\dots,n$, not all $0$, with $0=\mu w + \sum_{i=1}^{n} \lambda_i v_i$. Now $\mu \neq 0$, because otherwise $(v_1, \dots , v_n)$ would be linearly dependent. Therefore $w=-\frac{1}{\mu} \sum_{i=1}^{n} \lambda_i v_i$, but this means $w$ is in the span of $(v_1, \dots , v_n)$, a contradiction.
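The contradiction argument can also be traced numerically (again a NumPy sketch with arbitrarily chosen vectors): stack the $n+1$ vectors as columns, pull a nonzero null-space vector out of the resulting matrix, observe that the coefficient $\mu$ on $w$ is nonzero, and solve for $w$ as a combination of the others.

```python
import numpy as np

# n = 3 linearly independent vectors v_1, v_2, v_3 (columns of V) and an
# extra vector w; all values are arbitrary illustrative choices.
V = np.column_stack([
    [1.0, 0.0, 1.0],
    [2.0, 1.0, 0.0],
    [0.0, 1.0, 1.0],
])
w = np.array([1.0, 2.0, 3.0])

# Stack all four vectors into a 3x4 matrix M. Four vectors in R^3 are always
# linearly dependent, so M has a nontrivial null space; the last right
# singular vector from the SVD spans it.
M = np.column_stack([V, w])
_, _, Vt = np.linalg.svd(M)
coeffs = Vt[-1]                 # nonzero vector with M @ coeffs ≈ 0
lam, mu = coeffs[:3], coeffs[3]

# mu must be nonzero; otherwise v_1, v_2, v_3 themselves would be dependent.
assert abs(mu) > 1e-12

# Solving 0 = mu*w + V @ lam for w exhibits w in the span of v_1, v_2, v_3.
w_reconstructed = -(V @ lam) / mu
print(np.allclose(w_reconstructed, w))
```

This mirrors the proof step for step: the dependence relation $0 = \mu w + \sum_i \lambda_i v_i$ is the null-space vector, and dividing by $\mu$ recovers $w$ from the $v_i$.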