Best way to show that these $3$ vectors are a basis of the vector space $\mathbb{R}^{3}$?


I'm writing a linear algebra exam soon and I'd like to know a fast way to solve a task like this:

$$u=\begin{pmatrix} 0\\ 0\\ 1 \end{pmatrix}, v= \begin{pmatrix} 1\\ 0\\ 1 \end{pmatrix}, w = \begin{pmatrix} 0\\ 1\\ 1 \end{pmatrix}$$

Show that $(u,v,w)$ is a basis of the vector space $\mathbb{R}^{3}.$

I would start by checking if these vectors are linearly independent. I do this by checking determinant $\neq 0$:

$$\det\begin{pmatrix} 0 & 1 & 0\\ 0 & 0 & 1\\ 1 & 1 & 1 \end{pmatrix}$$

Using Sarrus' rule, the determinant is $1 \neq 0$, and thus these vectors are linearly independent.
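As a quick numeric cross-check of the determinant computation (this is just an illustration, not part of a written proof):

```python
import numpy as np

# Matrix whose columns are u, v, w.
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 1, 1]])

# A nonzero determinant means the columns are linearly independent.
print(round(np.linalg.det(A)))  # 1
```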

Now let $x,y,z \in \mathbb{R}$ be arbitrary:

$$\begin{pmatrix} x\\ y\\ z \end{pmatrix}=\lambda_{1}\begin{pmatrix} 0\\ 0\\ 1 \end{pmatrix}+ \lambda_{2}\begin{pmatrix} 1\\ 0\\ 1 \end{pmatrix}+ \lambda_{3}\begin{pmatrix} 0\\ 1\\ 1 \end{pmatrix}$$

$$x = \lambda_{2}$$

$$y = \lambda_{3}$$

$$z = \lambda_{1}+\lambda_{2}+\lambda_{3}$$

This system has the unique solution $\lambda_{2}=x$, $\lambda_{3}=y$, $\lambda_{1}=z-x-y$, so every vector of $\mathbb{R}^{3}$ is a linear combination of $u,v,w$, and thus $(u,v,w)$ is a basis of the vector space $\mathbb{R}^{3}$.
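The coordinates for any target vector can be found by solving the linear system numerically; here is a small sketch with an arbitrarily chosen example vector $(x,y,z) = (2,3,7)$:

```python
import numpy as np

# Columns are u, v, w.
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 1, 1]], dtype=float)

target = np.array([2.0, 3.0, 7.0])  # example (x, y, z)

# Unique solution exists because det(A) != 0.
lam = np.linalg.solve(A, target)
print(lam)  # lambda_1 = z - x - y = 2, lambda_2 = x = 2, lambda_3 = y = 3
```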


Did I solve the task correctly, and are there better ways to do it?


Accepted answer:

Once you have proved that the $3$ vectors are linearly independent, you automatically have that they are a basis for $\mathbb{R}^3$, since they generate a subspace with dimension $3$ of a space of dimension $3$ - so they must generate the entire space! As for proving linear independence, the determinant approach proposed in the question is general and works well.

In this particular case, a simpler approach is to see that $v-u=(1,0,0)^T$, $w-u=(0,1,0)^T$, $u=(0,0,1)^T$ form what is called the canonical basis of $\mathbb{R}^3$, so $u,v,w$ must also form a basis of $\mathbb{R}^3$.
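The differences described above can be verified in a couple of lines:

```python
import numpy as np

u = np.array([0, 0, 1])
v = np.array([1, 0, 1])
w = np.array([0, 1, 1])

# The differences recover the standard basis vectors e1, e2, e3.
print(v - u)  # [1 0 0]
print(w - u)  # [0 1 0]
print(u)      # [0 0 1]
```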

Another answer:

You were done the second you proved they were linearly independent: if you have $n$ linearly independent vectors in a space of dimension $n$, then those vectors form a basis.

Another answer:

Your answer is correct.

Any $3$ linearly independent vectors in a $3$-dimensional vector space are a basis for that vector space. You can check this, as you did correctly, by calculating that determinant. Notice that in a more abstract $3$-dimensional vector space, where the vectors are for example functions, you can perform the same trick using the coordinates of those functions relative to a basis you already know. If the determinant of those coordinate vectors is nonzero, you have found a basis as well!
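To illustrate that coordinate trick, here is a hypothetical example (the polynomials chosen are my own, not from the answer): in the space $P_2$ of polynomials of degree at most $2$, with known basis $(1, x, x^2)$, check whether $p_1 = 1+x$, $p_2 = x+x^2$, $p_3 = 1+x^2$ form a basis.

```python
import numpy as np

# Columns hold each polynomial's coordinates relative to (1, x, x^2):
# p1 = 1 + x  -> (1, 1, 0)
# p2 = x + x^2 -> (0, 1, 1)
# p3 = 1 + x^2 -> (1, 0, 1)
coords = np.array([[1, 0, 1],
                   [1, 1, 0],
                   [0, 1, 1]])

# Nonzero determinant -> (p1, p2, p3) is a basis of P_2.
print(round(np.linalg.det(coords)))  # 2
```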

Another answer:

Computing a determinant will always work to determine linear independence, but in some cases you can determine this by inspection. By definition, a set of vectors is linearly independent if the only linear combination of them that sums to zero has all zero coefficients.

Observe that $u$ and $w$ both have first component $0$, so the first component of any linear combination equals the coefficient of $v$; hence in any combination that produces $0$, the coefficient of $v$ must be $0$. A similar observation on the second component shows that the coefficient of $w$ must also be zero, and then the coefficient of $u$ must be zero as well to eliminate the third component of the sum. As others have pointed out, at this point you're done, because by the definition of dimension this set of three linearly independent vectors must span the entire space.
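A numeric cross-check of this inspection argument, sketched with numpy: the matrix with columns $u, v, w$ has full rank, so $A c = 0$ forces $c = 0$.

```python
import numpy as np

# Columns are u, v, w.
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 1, 1]])

# Full rank (3) means the homogeneous system A @ c = 0 has only c = 0,
# i.e. the columns are linearly independent.
print(np.linalg.matrix_rank(A))  # 3
```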