Is there a quick way to determine if certain combinations of columns of a matrix are linearly independent? (finding basic feasible solutions)


This is in relation to finding all basic feasible solutions of $Ax = b$, $x \geq 0$, where $A =$$$\begin{pmatrix} 1 & 2 & 2 & 1 & 0 \\ 1 & 2 & 1 & 1 & 1 \\ 3 & 6 & 2 & 1 & 0 \end{pmatrix}.$$

and $b =$ $$\begin{pmatrix} 12\\ 18 \\ 36 \end{pmatrix}.$$

I know that to find the BFSs, we must first determine all sets of $m = 3$ columns that are linearly independent. I am aware that such a set is linearly independent if and only if the matrix formed by the three chosen columns can be row-reduced to the identity matrix. However, this is rather painstaking, especially under exam conditions. Is there another way to determine these sets?


There are 3 answers below.

Answer 1:

First, note that since the matrix has more columns than rows, the full set of five columns is guaranteed to be linearly dependent. One way to determine which sets of three columns are independent is to compute the $3\times 3$ subdeterminants of your matrix: a set of three columns is linearly independent exactly when the corresponding determinant is nonzero. Of course, this can be computationally intensive too if you want to classify all the sets, since it requires ${5\choose 3} = 10$ determinant computations in this particular instance. The potential advantage is that if you're only looking for one linearly dependent set, you may be able to stop early, since it's unlikely you would need to exhaust all possibilities before finding one.
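The subdeterminant approach is easy to sketch in code; a minimal version assuming NumPy is available (the 0-based column indices are my own convention, not part of the answer):

```python
from itertools import combinations

import numpy as np

A = np.array([[1, 2, 2, 1, 0],
              [1, 2, 1, 1, 1],
              [3, 6, 2, 1, 0]], dtype=float)

independent = []
for cols in combinations(range(A.shape[1]), 3):
    # A nonzero 3x3 determinant means the chosen columns are independent.
    if abs(np.linalg.det(A[:, cols])) > 1e-9:
        independent.append(cols)

print(independent)
```

With 0-based indices, this reports the six triples not containing both of the first two (proportional) columns and excluding the dependent triple of the last three columns.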

Answer 2:

Elementary row operations don't change the linear relations between the columns, so we can find the reduced row echelon form: $$ \begin{pmatrix} 1 & 2 & 2 & 1 & 0 \\ 1 & 2 & 1 & 1 & 1 \\ 3 & 6 & 2 & 1 & 0 \end{pmatrix} \to \begin{pmatrix} 1 & 2 & 2 & 1 & 0 \\ 0 & 0 & -1 & 0 & 1 \\ 0 & 0 & -4 & -2 & 0 \end{pmatrix} \to \begin{pmatrix} 1 & 2 & 2 & 1 & 0 \\ 0 & 0 & 1 & 0 & -1 \\ 0 & 0 & 0 & -2 & -4 \end{pmatrix} \to \begin{pmatrix} 1 & 2 & 2 & 1 & 0 \\ 0 & 0 & 1 & 0 & -1 \\ 0 & 0 & 0 & 1 & 2 \end{pmatrix} \to \begin{pmatrix} 1 & 2 & 2 & 0 & -2 \\ 0 & 0 & 1 & 0 & -1 \\ 0 & 0 & 0 & 1 & 2 \end{pmatrix} \to \begin{pmatrix} 1 & 2 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & -1 \\ 0 & 0 & 0 & 1 & 2 \end{pmatrix} $$ If we denote the columns of $A$ by $a_1,\dots,a_5$, we can read off that the sets $$ \{a_1,a_3,a_4\},\quad \{a_1,a_3,a_5\},\quad \{a_1,a_4,a_5\},\quad \{a_2,a_3,a_4\},\quad \{a_2,a_3,a_5\},\quad \{a_2,a_4,a_5\} $$ are the maximal linearly independent sets of columns.
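This elimination can also be done mechanically; a sketch assuming SymPy is available (`Matrix.rref`, `extract`, and `rank` are standard SymPy methods; the 0-based column indices are my own convention):

```python
from itertools import combinations

import sympy as sp

A = sp.Matrix([[1, 2, 2, 1, 0],
               [1, 2, 1, 1, 1],
               [3, 6, 2, 1, 0]])

# Row operations preserve linear relations among the columns,
# so row-reduce once and test column subsets of the RREF.
R, pivots = A.rref()
print(R)

independent = [cols for cols in combinations(range(A.cols), 3)
               if R.extract(list(range(R.rows)), list(cols)).rank() == 3]
print(independent)
```

Exact rational arithmetic in SymPy avoids the floating-point tolerance needed with a determinant-based test.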

Answer 3:

Look for columns that are multiples of each other and for simple linear combinations of columns. For instance, it’s immediately obvious that the second column is twice the first, and slightly less obvious that the middle column is a linear combination of the last two (writing $a_i$ for the $i$-th column, $a_3 = 2a_4 - a_5$).
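These eyeballed relations are easy to verify numerically; a quick sketch assuming NumPy, with the columns of $A$ unpacked as $a_1,\dots,a_5$:

```python
import numpy as np

A = np.array([[1, 2, 2, 1, 0],
              [1, 2, 1, 1, 1],
              [3, 6, 2, 1, 0]], dtype=float)

# Unpack the columns of A.
a1, a2, a3, a4, a5 = A.T

print(np.allclose(a2, 2 * a1))       # second column is twice the first
print(np.allclose(a3, 2 * a4 - a5))  # middle column = 2*a4 - a5
```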