If we want to check whether some vectors are linearly independent, we can take their coordinates relative to a basis we know and check whether the coordinate vectors are linearly independent. Nice and short: if $[v_{1}]_{B_{1}},[v_{2}]_{B_{1}},\dots,[v_{n}]_{B_{1}}$ are linearly independent, where $B_{1}$ is some basis, then $\{ v_{1},v_{2},\dots,v_{n} \}$ are independent.
I'm sorry for not being very eloquent, but the truth is I'm not sure about this and I can't understand why it works. I'll leave an example.
Let $\mathbb{P}_{2}$ be the space of all polynomials of degree two or less. Consider the following vectors: $$v_{1} = 1;\\ v_{2} = 2+x;\\ v_{3} = 3 + x^{2} $$ We want to check whether they span the space, namely whether they form a basis for $\mathbb{P}_{2}$.
So to solve this we actually start by building the coordinate vectors relative to the standard basis of $\mathbb{P}_{2}$, $B = \{1, x, x^{2}\}$:
$[v_{1}]_{B} = \begin{bmatrix} 1\\ 0\\ 0\\ \end{bmatrix}, [v_{2}]_{B} = \begin{bmatrix} 2\\ 1\\ 0\\ \end{bmatrix}, [v_{3}]_{B} = \begin{bmatrix} 3\\ 0\\ 1\\ \end{bmatrix}$
If these vectors form a basis for $\mathbb{R}^{3}$*, then so do the original ones for $\mathbb{P}_{2}$. I can't understand why that is. I would really appreciate any answer.
*To check that, we would then proceed to build a matrix, let's call it $A$, whose columns are respectively $[v_{1}]_{B},[v_{2}]_{B}$ and $[v_{3}]_{B}$: $$A = \begin{bmatrix} 1&2 &3\\ 0&1 &0\\ 0&0 &1 \end{bmatrix}$$ and check whether this matrix is invertible. Why do we check if the matrix is invertible?
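As a quick sanity check, this particular $A$ can be verified to be invertible numerically by computing its determinant; a minimal sketch in plain Python (the helper `det3` is mine, just cofactor expansion for a $3\times 3$ matrix):

```python
# Coordinate vectors of v1, v2, v3 relative to B = {1, x, x^2}, as columns of A.
A = [[1, 2, 3],
     [0, 1, 0],
     [0, 0, 1]]

def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

print(det3(A))  # 1: nonzero, so A is invertible and its columns are independent
```

Since $A$ is upper triangular, the determinant is just the product of the diagonal entries, $1 \cdot 1 \cdot 1 = 1$.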
Matrix $A$ is invertible if and only if $Ax = 0$ implies $x=0$ (if $A^{-1}$ exists, then $x = A^{-1}(Ax) = A^{-1}0 = 0$). Write $Ax$ out as $$\pmatrix{a_{1,1}&a_{1,2}&\cdots&a_{1,n}\\a_{2,1} & a_{2,2} &\cdots &a_{2,n}\\\vdots & & \ddots &\vdots\\a_{n,1}&a_{n,2}&\cdots&a_{n,n}}\pmatrix{x_1\\x_2\\\vdots\\x_n} =\\= x_1\pmatrix{a_{1,1} \\ a_{2,1} \\\vdots \\a_{n,1}} + x_2\pmatrix{a_{1,2} \\ a_{2,2} \\\vdots \\a_{n,2}} +\cdots+ x_n\pmatrix{a_{1,n} \\ a_{2,n} \\\vdots \\a_{n,n}}$$
and observe that $Ax$ is simply a linear combination of the columns of $A$. So $Ax$ is zero if and only if the linear combination of its columns according to the coordinates of $x$ is zero. Do you see why the invertibility of $A$ is equivalent to the independence of its columns?
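The column-combination identity above can be checked directly on the $A$ from the question; a small sketch in plain Python (the coefficient vector `x` is an arbitrary choice of mine):

```python
A = [[1, 2, 3],
     [0, 1, 0],
     [0, 0, 1]]
x = [2, -1, 5]  # an arbitrary coefficient vector

# Matrix-vector product Ax, computed row by row.
Ax = [sum(A[i][j] * x[j] for j in range(3)) for i in range(3)]

# The columns of A, and the linear combination x1*col1 + x2*col2 + x3*col3.
cols = [[A[i][j] for i in range(3)] for j in range(3)]
combo = [x[0]*cols[0][i] + x[1]*cols[1][i] + x[2]*cols[2][i] for i in range(3)]

print(Ax, combo)  # both [15, -1, 5]: Ax is exactly that combination of columns
```

So $Ax = 0$ for some $x \neq 0$ is precisely a nontrivial dependence among the columns.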
Furthermore, linear independence does not depend on the choice of basis: a set of vectors is linearly independent if and only if their coordinate vectors with respect to any (and hence every) basis are linearly independent. Is that clear after seeing the above equivalence?
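To spell that out for the example in the question: the coordinate map $v \mapsto [v]_{B}$ is linear and injective, so
$$c_{1}v_{1} + c_{2}v_{2} + c_{3}v_{3} = 0 \iff [c_{1}v_{1} + c_{2}v_{2} + c_{3}v_{3}]_{B} = 0 \iff c_{1}[v_{1}]_{B} + c_{2}[v_{2}]_{B} + c_{3}[v_{3}]_{B} = 0.$$
Hence a nontrivial dependence among $v_{1}, v_{2}, v_{3}$ in $\mathbb{P}_{2}$ exists exactly when the same dependence exists among their coordinate vectors in $\mathbb{R}^{3}$, whichever basis $B$ was used to take coordinates.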
Edit: wrote it out.