Consider the matrix representing 6 linearly dependent vectors: $$\left(\begin{array}{llllll} 1 & 0 & 0 & 0 & 0 & 1 \\ 0 & 1 & 0 & 0 & 0 & 1 \\ 0 & 0 & 1 & 0 & 0 & 1 \\ 0 & 0 & 0 & 1 & 0 & 1 \\ 0 & 0 & 0 & 0 & 1 & 1 \end{array}\right)$$ I know how to prove that the vectors in this matrix are linearly dependent, but how can I show (concisely) that removing any one of the vectors gives a linearly independent set?
Proving that removing any vector of the linearly dependent set gives a linearly independent set
1.4k views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 3 answers below.
Note that the given matrix is in row reduced echelon form and has rank 5; that is, its columns span $\mathbb R^5$. A basis of $\mathbb R^5$ has exactly 5 vectors, so there cannot be more than 5 linearly independent vectors in $\mathbb R^5$. Now remove any column of the given matrix (say the 3rd), and you are left with a $5 \times 5$ matrix which (after rearranging the columns) is in row reduced form with 5 pivots; hence the columns of this matrix are linearly independent. The same conclusion holds if you remove any other column. Hence, removing any column from the matrix given in your post results in a full-rank square matrix.
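This rank argument is easy to verify numerically. A minimal sketch with NumPy's `matrix_rank`, where `M` is the matrix from the question:

```python
import numpy as np

# The 5x6 matrix from the question: five standard basis vectors plus the all-ones vector.
M = np.array([
    [1, 0, 0, 0, 0, 1],
    [0, 1, 0, 0, 0, 1],
    [0, 0, 1, 0, 0, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 0, 1, 1],
], dtype=float)

# Deleting any single column leaves a full-rank 5x5 matrix.
for j in range(6):
    Mj = np.delete(M, j, axis=1)
    assert np.linalg.matrix_rank(Mj) == 5
```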
You can build matrices excluding one column at a time.
If you call the original matrix $M$, and denote by $M_j$ the matrix excluding column $j$, then you can build a sparse matrix that performs the exclusion, giving a factorization of $M_j$ in terms of $M$ like this:
$$M_j = ME_j$$
where each $E_j$ is a truncated identity matrix with column $j$ cut out (so $E_j$ has shape $6 \times 5$ when $M$ has 6 columns).
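As a sketch of this factorization (using NumPy, with `np.delete` standing in for a genuinely sparse construction): deleting column $j$ of the identity gives $E_j$, and $ME_j$ equals $M$ with column $j$ removed.

```python
import numpy as np

M = np.array([
    [1, 0, 0, 0, 0, 1],
    [0, 1, 0, 0, 0, 1],
    [0, 0, 1, 0, 0, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 0, 1, 1],
], dtype=float)

def E(j, n=6):
    """Truncated identity: the n x n identity with column j removed (0-indexed)."""
    return np.delete(np.eye(n), j, axis=1)

# M @ E_j reproduces M with column j deleted.
Mj = M @ E(2)
assert np.array_equal(Mj, np.delete(M, 2, axis=1))
```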
Now, to express what part of a random vector $r$ the matrix $M_j$ can capture, we can calculate
$$t_j = M_j({M_j}^{\dagger})r$$
where $\dagger$ denotes a suitable pseudo-inverse, for example the Moore–Penrose inverse.
If all the $t_j$ are the same, then each vector is linearly dependent on the others.
But if one of them differs, then excluding that column loses some information.
An example where that happens:
$$\begin{bmatrix}0&0&0\\0&1&1\\1&1&1\end{bmatrix}$$
Excluding either column 2 or column 3 makes no difference, since they are identical.
But if we exclude column 1, we lose information.
If you try it, you will find $t_2 \approx t_3 \neq t_1$.
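A minimal numerical sketch of this check, using NumPy's Moore–Penrose pseudo-inverse (`np.linalg.pinv`) and a fixed random seed for reproducibility:

```python
import numpy as np

# The 3x3 example: columns 2 and 3 are identical, column 1 carries extra information.
A = np.array([[0, 0, 0],
              [0, 1, 1],
              [1, 1, 1]], dtype=float)

rng = np.random.default_rng(0)
r = rng.standard_normal(3)  # a random test vector

t = []
for j in range(3):
    Aj = np.delete(A, j, axis=1)           # M_j: exclude column j
    t.append(Aj @ np.linalg.pinv(Aj) @ r)  # project r onto the column space of M_j

# Dropping column 2 or 3 gives the same column space, so t_2 == t_3;
# dropping column 1 shrinks the column space, so t_1 differs.
assert np.allclose(t[1], t[2])
assert not np.allclose(t[0], t[1])
```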
Here is the fastest method I can think of.
If (for example) the first 5 columns are linearly dependent, you can find a vector in the right kernel of the form $$\begin{bmatrix} a\\ b\\ c\\ d\\ e\\ 0 \end{bmatrix},$$ that is, a vector that ends with $0$. In general, if you leave out the $i$-th column and the remaining columns are linearly dependent, you can find a vector in the right kernel with a $0$ in the $i$-th position.
Since the only vectors in the right kernel of your matrix are the multiples of $$\begin{bmatrix} 1\\ 1\\ 1\\ 1\\ 1\\ -1 \end{bmatrix},$$ and this vector has no zero entries, you can deduce that any 5 columns are linearly independent.
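The kernel computation can be sketched with NumPy's SVD: the rows of $V^T$ that correspond to zero singular values span the right kernel.

```python
import numpy as np

M = np.array([
    [1, 0, 0, 0, 0, 1],
    [0, 1, 0, 0, 0, 1],
    [0, 0, 1, 0, 0, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 0, 1, 1],
], dtype=float)

# Full SVD: M is 5x6 with rank 5, so exactly one row of Vt spans the right kernel.
_, s, Vt = np.linalg.svd(M)
k = Vt[5]  # kernel basis vector (up to sign and scale)

assert np.allclose(M @ k, 0)
# Normalizing by the first entry recovers (1, 1, 1, 1, 1, -1).
assert np.allclose(k / k[0], [1, 1, 1, 1, 1, -1])
# No zero entries, hence any 5 columns are linearly independent.
assert np.all(np.abs(k) > 1e-12)
```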