Let $V$ be an $n$-dimensional vector space. Then any linearly independent set of vectors $\{v_1, v_2, \ldots, v_n\}$ is a basis for $V$.
Proof:
Suppose $B = \{v_1, v_2, \ldots, v_n\}$ does not span $V$. Then there is some $w \in V$ that cannot be written as a combination of $B$ alone; expressing it requires an extra vector $v_{n+1} \notin \operatorname{span}(B)$, say $\displaystyle{w = \sum_{i = 1}^n k_iv_i + k_{n + 1}v_{n+ 1}}$. Now $C = \{v_1, v_2, \ldots, v_n, v_{n + 1}\}$ contains $n + 1$ vectors and is therefore linearly dependent, and since $v_1, \ldots, v_n$ are linearly independent, the coefficient of $v_{n+1}$ in any nontrivial dependence relation must be nonzero; hence we can write $v_{n + 1}$ in terms of the preceding vectors: $\displaystyle{v_{n + 1} = \sum_{i = 1}^nc_iv_i}$. Thus $\displaystyle{w = \sum_{i = 1}^nk_iv_i + k_{n + 1}\sum_{i = 1}^nc_iv_i = \sum_{i = 1}^n(k_i + k_{n + 1}c_i)v_i}$, meaning $B$ spans $V$ after all, a contradiction.
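The final substitution is just collecting coefficients. As a toy illustration (made-up numbers in $\mathbb{R}^2$, not from the proof itself), suppose $v_3 = 2v_1 + 3v_2$ and $w = v_1 + v_2 + 5v_3$; then

```latex
w = v_1 + v_2 + 5v_3
  = v_1 + v_2 + 5(2v_1 + 3v_2)
  = (1 + 5\cdot 2)\,v_1 + (1 + 5\cdot 3)\,v_2
  = 11\,v_1 + 16\,v_2,
```

so $w$ lies in the span of $v_1, v_2$ alone, exactly as in the last line of the proof.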
I have a question about the linear dependence of $C$. Suppose $B$ spans $V$. Then since $v_{n + 1} \in V,$ we have $\displaystyle{v_{n + 1} = \sum_{i = 1}^nk_iv_i}$, so $\displaystyle{\sum_{i = 1}^nk_iv_i + (-1)v_{n + 1}} = \vec0$, which implies $C$ is linearly dependent. In other words, to conclude that $C$ is linearly dependent, it seems we need $B$ to span $V$, which is exactly what the proof is trying to establish. So how does the proof above get that $C$ is dependent?
It goes back to the definition of dimension of a vector space.
When we say the space is $n$-dimensional, we mean that every set of linearly independent vectors has at most $n$ elements.
So if you have $n+1$ vectors, they cannot be linearly independent: one of them must be a linear combination of the others.
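To see this concretely (a toy example of my own in $\mathbb{R}^2$, where $n = 2$): any three vectors must be dependent. For instance,

```latex
v_1 = (1,0),\qquad v_2 = (1,1),\qquad v_3 = (3,5)
\quad\Longrightarrow\quad v_3 = -2v_1 + 5v_2,
```

so $-2v_1 + 5v_2 - v_3 = \vec0$ is a nontrivial dependence relation. No appeal to spanning was needed, only to the cardinality bound in the definition of dimension.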