Are the following vectors linearly independent?


Suppose the vectors $v_{1}, v_{2}, v_{3}, v_{4}$ are linearly independent.

Are the following vectors $b_{1}, b_{2}, b_{3}$ linearly independent too?

$b_{1}= 3v_{1} + 2v_{2} + v_{3} + v_{4} $

$b_{2}= 2v_{1} + 5v_{2} + 3v_{3} + 2v_{4} $

$b_{3}= 3v_{1} + 4v_{2} + 2v_{3} + 3v_{4} $

What I've done so far

$k_{1}(3v_{1} + 2v_{2} + v_{3} + v_{4})+k_{2}(2v_{1} + 5v_{2} + 3v_{3} + 2v_{4})+k_{3}(3v_{1} + 4v_{2} + 2v_{3} + 3v_{4}) = 0$

then I don't know what to do. I don't even know whether the $0$ above is supposed to be a vector or a scalar.

I've solved these kinds of problems when the vectors are given explicitly, but I don't know how to solve this one. Could you please help?

Thank you.

4 Answers

BEST ANSWER

Result: Let $\alpha_{ij}$; $1\leq i, j\leq n$ be scalars such that the system of equations:

$$\sum\limits_{j=1}^n x_j\alpha_{ij}=0,\qquad 1\leq i\leq n$$

has only trivial solution. Let $\{v_1, v_2,\dots, v_n\}$ be any linearly independent set of vectors. Then $\{\sum\limits_{i=1}^n \alpha_{i1}v_i, \sum\limits_{i=1}^n \alpha_{i2}v_i, \dots, \sum\limits_{i=1}^n \alpha_{in}v_i\}$ is also linearly independent.

Proof: Suppose on the contrary that $\{v_1, v_2,\dots, v_n\}$ is linearly independent and $\{\sum\limits_{i=1}^n \alpha_{i1}v_i, \sum\limits_{i=1}^n \alpha_{i2}v_i, \dots, \sum\limits_{i=1}^n \alpha_{in}v_i\}$ is linearly dependent. Then there exist scalars $c_1,c_2, \dots, c_n$ (not all zero) such that

$$c_1\sum\limits_{i=1}^n \alpha_{i1}v_i+c_2\sum\limits_{i=1}^n \alpha_{i2}v_i+\dots +c_n\sum\limits_{i=1}^n \alpha_{in}v_i=\bf{0}.$$

Regrouping by $v_i$, this reads $\sum\limits_{i=1}^n\left(\sum\limits_{j=1}^n c_j\alpha_{ij}\right)v_i=\mathbf{0}$. Since $\{v_1, v_2,\dots, v_n\}$ is linearly independent, we must have

$$\sum\limits_{j=1}^n c_j\alpha_{ij}=0,\qquad 1\leq i\leq n,$$

which contradicts the hypothesis that this system has only the trivial solution, since the $c_j$ are not all zero.

A similar technique applies if the transformed set has cardinality less than $n$.
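As a quick numerical illustration of this result (a sketch using NumPy; the particular coefficient matrix is randomly generated, not from the problem), one can start from a trivially independent set, transform it by an $\alpha$ whose homogeneous system has only the trivial solution, and check that the rank is preserved:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
V = np.eye(n)  # columns v_1..v_n: trivially linearly independent

# Draw an integer alpha matrix; redraw until its homogeneous system
# has only the trivial solution (i.e. alpha has full rank n).
alpha = rng.integers(-3, 4, size=(n, n)).astype(float)
while np.linalg.matrix_rank(alpha) < n:
    alpha = rng.integers(-3, 4, size=(n, n)).astype(float)

# Column j of B is sum_i alpha_{ij} v_i, as in the statement above.
B = V @ alpha
print(np.linalg.matrix_rank(B))  # rank n => the new set is independent
```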

0
On

Hint:

To show they're independent, you have to show the matrix $$\begin{bmatrix} b_1&b_2&b_3 \end{bmatrix}=\begin{bmatrix} 3&2&3\\[-1ex] 2&5&4\\[-1ex]1&3&2 \\[-1ex]1&2&3\end{bmatrix}$$ has maximal rank ($3$), which means there is a $3\times 3$ minor which is $\ne 0$.
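This rank criterion can be checked numerically (a sketch assuming NumPy; the matrix entries are the coordinates of $b_1, b_2, b_3$ in the basis $v_1,\dots,v_4$):

```python
import numpy as np

# Columns are b_1, b_2, b_3 written in the basis v_1..v_4.
M = np.array([[3, 2, 3],
              [2, 5, 4],
              [1, 3, 2],
              [1, 2, 3]], dtype=float)

print(np.linalg.matrix_rank(M))  # 3 => maximal rank, so independent
# One nonzero 3x3 minor suffices, e.g. the top three rows:
print(np.linalg.det(M[:3, :]))   # nonzero (up to floating-point error)
```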


What you did so far is good. What you need to do next is to collect the like terms and simplify the equation. Since the vectors $\vec{v_i}, 1\le i \le 4$ are linearly independent, the coefficient of each $\vec{v_i}$ must be zero.

For example, the coefficient of $\vec{v_1}$ must be zero: $3k_1+2k_2+3k_3=0$. Same for the coefficients of the other basis vectors.

The resulting linear system has 3 unknowns $k_1, k_2, k_3$ and four equations. If its only solution is $k_i=0, 1\le i \le 3$, the vectors are linearly independent; any other solution would mean they are linearly dependent.
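As a sketch of this step (assuming SymPy is available), one can solve the four homogeneous equations in $k_1, k_2, k_3$ directly:

```python
from sympy import Matrix, symbols, linsolve

k1, k2, k3 = symbols('k1 k2 k3')

# One row per basis vector v_i; entries are the coefficients of k1, k2, k3.
A = Matrix([[3, 2, 3],   # 3k1 + 2k2 + 3k3 = 0  (coefficient of v_1)
            [2, 5, 4],   # 2k1 + 5k2 + 4k3 = 0  (coefficient of v_2)
            [1, 3, 2],   #  k1 + 3k2 + 2k3 = 0  (coefficient of v_3)
            [1, 2, 3]])  #  k1 + 2k2 + 3k3 = 0  (coefficient of v_4)
b = Matrix([0, 0, 0, 0])

# Only the trivial solution (0, 0, 0) => the b's are independent.
print(linsolve((A, b), k1, k2, k3))
```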


You have $k_1(3v_1+ 2v_2+ v_3+ v_4)+ k_2(2v_1+5v_2+ 3v_3+ 2v_4)+ k_3(3v_1+ 4v_2+ 2v_3+ 3v_4)= 0$. That is a sum of vectors multiplied by scalars. Adding two vectors results in a vector and multiplying a vector by a scalar results in a vector. The "$0$" is the zero vector.

You also should know that the basic "rules of arithmetic", specifically the distributive and commutative laws, apply to vector arithmetic. The above can be rewritten as $3k_1v_1+ 2k_1v_2+ k_1v_3+ k_1v_4+ 2k_2v_1+5k_2v_2+ 3k_2v_3+ 2k_2v_4+ 3k_3v_1+ 4k_3v_2+ 2k_3v_3+ 3k_3v_4= 0$ and then as $(3k_1+ 2k_2+ 3k_3)v_1+ (2k_1+ 5k_2+ 4k_3)v_2+ (k_1+ 3k_2+ 2k_3)v_3+ (k_1+ 2k_2+ 3k_3)v_4= 0$.

Since we are given that $v_1$, $v_2$, $v_3$, and $v_4$ are independent, each of those coefficients must be $0$. We must have $3k_1+ 2k_2+ 3k_3= 0$, $2k_1+ 5k_2+ 4k_3= 0$, $k_1+ 3k_2+ 2k_3= 0$, and $k_1+ 2k_2+ 3k_3= 0$.

Obviously, $k_1= k_2= k_3= 0$ satisfies those. Are there any other, non-trivial, solutions?
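The final question can be settled by computing the null space of the coefficient matrix (a sketch assuming SymPy; an empty null space means no non-trivial solutions exist):

```python
from sympy import Matrix

# Rows are the four equations in k_1, k_2, k_3 derived above.
A = Matrix([[3, 2, 3],
            [2, 5, 4],
            [1, 3, 2],
            [1, 2, 3]])

# The null space contains every (k1, k2, k3) solving the system;
# an empty list means only the trivial solution, so the b's are independent.
print(A.nullspace())
```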