Elegant way to prove that this vector set is linearly independent


I'm trying to find an elegant way to prove that this vector set $$\begin{bmatrix}1\\ 1\\1\\0\end{bmatrix}, ~~\begin{bmatrix}1\\ 1\\0\\1\end{bmatrix},~~\begin{bmatrix}1\\ 0\\1\\1\end{bmatrix},~~\begin{bmatrix}0\\ 1\\1\\1\end{bmatrix}$$ is linearly independent. I tried computing the determinant of the matrix whose columns are these vectors via cofactor expansion. I also tried row-reducing that matrix to identify the linearly independent vectors from the pivots. Is there another way to prove such linear independence?
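
As a quick numerical sanity check of both attempts (a sketch, not a proof), the determinant and the rank can be computed in a few lines of numpy:

```python
import numpy as np

# Columns are the four given vectors.
M = np.array([[1, 1, 1, 0],
              [1, 1, 0, 1],
              [1, 0, 1, 1],
              [0, 1, 1, 1]])

print(np.linalg.det(M))          # ≈ -3.0: nonzero, so the columns are independent
print(np.linalg.matrix_rank(M))  # 4: full rank, same conclusion
```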


4 Answers

Best Answer

Let your vectors be $v_4, v_3, v_2, v_1$, in the order listed, and consider the vector $$\frac{1}{3}\sum_{i=1}^4 v_i =\mathbb{1},$$

the all-ones vector. Notice that $$\mathbb{1}-v_i=e_i,\qquad i\in \left\{1,2,3,4\right\},$$

the standard basis vectors. Each $e_i$ is therefore a linear combination of the $v_i$, so the $v_i$ span $\mathbb{R}^4$; four vectors that span $\mathbb{R}^4$ must be linearly independent.
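
A minimal numpy sketch of this trick, using the labeling above (just a check, not part of the proof):

```python
import numpy as np

# v4, v3, v2, v1 in the order listed in the question.
v4 = np.array([1, 1, 1, 0])
v3 = np.array([1, 1, 0, 1])
v2 = np.array([1, 0, 1, 1])
v1 = np.array([0, 1, 1, 1])

ones = (v1 + v2 + v3 + v4) / 3  # the all-ones vector
print(ones)                     # [1. 1. 1. 1.]
for i, v in enumerate([v1, v2, v3, v4], start=1):
    print(i, ones - v)          # prints e_1, e_2, e_3, e_4 in turn
```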

Answer 2

Use the standard method: set $av_1+bv_2+cv_3+dv_4=\vec{0}$; if this forces all coefficients to vanish, the vectors are linearly independent. Entry by entry, the equation reads:

$a+b+c=0$

$a+b+d=0$

$a+c+d=0$

$b+c+d=0$

Solving this system is easy: adding all four equations gives $3(a+b+c+d)=0$, so $a+b+c+d=0$, and subtracting each original equation from this yields $d=c=b=a=0$. Hence the vectors are linearly independent.
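
A symbolic sketch of this computation with sympy (variable names mirror the equations above):

```python
from sympy import symbols, linsolve

a, b, c, d = symbols('a b c d')

# Homogeneous system from comparing entries; expressions are implicitly = 0.
sol = linsolve([a + b + c, a + b + d, a + c + d, b + c + d], (a, b, c, d))
print(sol)  # {(0, 0, 0, 0)} -- only the trivial solution
```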

A trickier way is to see that multiplying the matrix whose columns are these vectors on the left by

$$\left[ \begin {array}{cccc} 1&1&1&-2\\ 1&1&-2&1 \\ 1&-2&1&1\\ -2&1&1&1\end {array} \right] $$

gives a multiple of the identity matrix (namely $3I$). Hence an inverse exists, which is the same as saying that all column vectors are linearly independent.
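
A quick numpy check of this identity (a sketch; `B` below is the matrix just displayed):

```python
import numpy as np

M = np.array([[1, 1, 1, 0],
              [1, 1, 0, 1],
              [1, 0, 1, 1],
              [0, 1, 1, 1]])   # columns are the given vectors
B = np.array([[1, 1, 1, -2],
              [1, 1, -2, 1],
              [1, -2, 1, 1],
              [-2, 1, 1, 1]])

print(B @ M)  # 3 * I, so M is invertible with inverse B / 3
```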

Answer 3

You can observe that, if $$ M=\begin{bmatrix} 1 & 1 & 1 & 0 \\ 1 & 1 & 0 & 1 \\ 1 & 0 & 1 & 1 \\ 0 & 1 & 1 & 1 \end{bmatrix} $$ then $$ M\begin{bmatrix}1\\1\\1\\1\end{bmatrix}= 3\begin{bmatrix}1\\1\\1\\1\end{bmatrix}, \qquad M\begin{bmatrix}1\\-1\\-1\\1\end{bmatrix}= -\begin{bmatrix}1\\-1\\-1\\1\end{bmatrix}, \qquad M\begin{bmatrix}0\\-1\\1\\0\end{bmatrix}= \begin{bmatrix}0\\-1\\1\\0\end{bmatrix}, \qquad M\begin{bmatrix}-1\\0\\0\\1\end{bmatrix}= \begin{bmatrix}-1\\0\\0\\1\end{bmatrix}. $$ The last two vectors are clearly linearly independent, so the eigenvalue $1$ has multiplicity $2$. Thus $M$ has eigenvalues $3$, $-1$ and $1$ (double); since $0$ is not among them, $M$ is invertible.
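
As a sanity check on the spectrum (numpy again; `eigvalsh` applies because $M$ is symmetric):

```python
import numpy as np

M = np.array([[1, 1, 1, 0],
              [1, 1, 0, 1],
              [1, 0, 1, 1],
              [0, 1, 1, 1]])

# Eigenvalues of a symmetric matrix, returned in ascending order.
print(np.linalg.eigvalsh(M))  # [-1.  1.  1.  3.] -- no zero, so M is invertible
```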

However, row reduction is much easier than cofactor expansion: \begin{align} \begin{bmatrix} 1 & 1 & 1 & 0 \\ 1 & 1 & 0 & 1 \\ 1 & 0 & 1 & 1 \\ 0 & 1 & 1 & 1 \end{bmatrix} &\to \begin{bmatrix} 1 & 1 & 1 & 0 \\ 0 & 0 & -1 & 1 \\ 0 & -1 & 0 & 1 \\ 0 & 1 & 1 & 1 \end{bmatrix} && \begin{aligned}R_2&\gets R_2-R_1\\ R_3&\gets R_3-R_1\end{aligned} \\ &\to \begin{bmatrix} 1 & 1 & 1 & 0 \\ 0 & -1 & 0 & 1 \\ 0 & 0 & -1 & 1 \\ 0 & 1 & 1 & 1 \end{bmatrix} &&R_2\leftrightarrow R_3 \\ &\to \begin{bmatrix} 1 & 1 & 1 & 0 \\ 0 & -1 & 0 & 1 \\ 0 & 0 & -1 & 1 \\ 0 & 0 & 1 & 2 \end{bmatrix} &&R_4\gets R_4+R_2 \\ &\to \begin{bmatrix} 1 & 1 & 1 & 0 \\ 0 & -1 & 0 & 1 \\ 0 & 0 & -1 & 1 \\ 0 & 0 & 0 & 3 \end{bmatrix} &&R_4\gets R_4+R_3 \end{align} The triangular form has determinant $1\cdot(-1)\cdot(-1)\cdot 3=3$; the single row swap flips the sign, so $\det M=-3\neq 0$ and the columns are linearly independent.
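
The sign bookkeeping can be double-checked with an LU factorization (scipy's pivoting differs from the hand computation above, but the determinant must agree):

```python
import numpy as np
from scipy.linalg import lu

M = np.array([[1, 1, 1, 0],
              [1, 1, 0, 1],
              [1, 0, 1, 1],
              [0, 1, 1, 1]], dtype=float)

P, L, U = lu(M)  # M = P @ L @ U, with L unit lower triangular
det = np.linalg.det(P) * np.prod(np.diag(U))  # sign of P times product of pivots
print(det, np.linalg.det(M))  # both ≈ -3.0
```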

Answer 4

Assume the $v_i$ are linearly dependent.

Then there are scalars $c_i$, not all zero, such that $\sum c_i v_i = 0$. Each $v_i$ equals $\mathbb{1} - e_i$ (indexing the standard basis vectors suitably), where $\mathbb{1}$ is the all-ones vector, so subtracting $\left(\sum c_i\right)\mathbb{1}$ from both sides gives

$$\sum c_i e_i = \left(\sum c_i\right)\mathbb{1}.$$

Comparing the two sides entry by entry shows that each $c_j$ equals $\sum_i c_i$, so all the $c_j$ share a common value $c$. But then $c = 4c$, forcing $c = 0$, so all the $c_i$ vanish (contradiction).
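
The entrywise deduction can be sketched in sympy (the expressions below encode $c_j = \sum_i c_i$ for each $j$):

```python
from sympy import symbols, linsolve

c = symbols('c1 c2 c3 c4')
s = sum(c)  # c1 + c2 + c3 + c4

# c_j - s = 0 for every j; the only solution is all zeros.
print(linsolve([cj - s for cj in c], c))  # {(0, 0, 0, 0)}
```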