I need to find a maximal set of linearly independent columns of a matrix $A$. I've googled a lot and seen various solutions, but none of them seems to work for me. What I've seen so far is
1.- Using the Cauchy-Schwarz inequality. But everything along this line simply finds pairwise non-parallel vectors, which is not what I need.
2.- Using the QR decomposition. The suggested method is to compute $R$ in $A = QR$ and check the non-zero entries on its diagonal. This seems to work just fine when $m \geq n$, but in my case $m < n$, and apparently it doesn't work. For example, if
$$ A = \left[\begin{array}{ccccc} 0 & 0 & 4 & 1 & 1\\ 1 & 2 & 1 & 1 & 1\\ 2 & 4 & 3 & 1 & 2 \end{array} \right] $$
$A$ is of rank 3, but computing $R$ yields
$$ R = \left[\begin{array}{ccccc} -2.23 & -4.47 & -3.13 & -1.34 & -2.23 \\ 0 & 0 & -2.18 & -0.04 & -0.44 \\ 0 & 0 & -3.37 & -1.09 & -0.89 \end{array} \right] $$
I don't see how I would extract 3 linearly independent columns from this.
Any suggestion for another method? Thanks in advance!
Elementary row operations change the column space but do not alter the linear dependences among the columns. Thus if one column is a certain linear combination of the others before the row operations, then it is that same linear combination of them afterwards.
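As a quick numerical check of this invariance (a sketch using NumPy, on the matrix from the question): column 2 is twice column 1, and that same dependence survives a row swap, a row scaling, and adding a multiple of one row to another.

```python
import numpy as np

A = np.array([[0., 0., 4., 1., 1.],
              [1., 2., 1., 1., 1.],
              [2., 4., 3., 1., 2.]])

# In A, the second column is twice the first: a dependence among columns.
assert np.allclose(A[:, 1], 2 * A[:, 0])

# Apply some elementary row operations: swap, scale, add a multiple.
B = A.copy()
B[[0, 2]] = B[[2, 0]]        # swap rows 0 and 2
B[1] *= 3.0                  # scale row 1 by 3
B[2] -= 5.0 * B[0]           # subtract 5 times row 0 from row 2

# The same dependence still holds among the columns of B.
assert np.allclose(B[:, 1], 2 * B[:, 0])
```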
When you're done reducing, you have something like this: $$ \left[ \begin{array}{cccccccccc} a & \cdots & \bullet & \cdots & \bullet & \cdots \\ 0 & \cdots \\ \vdots \\ 0 & \cdots & b & \cdots & \bullet & \cdots \\ 0 & \cdots & 0 & \cdots\\ \vdots \\ 0 & \cdots & 0 & \cdots & c & \cdots \\ 0 & \cdots & 0 & \cdots & 0 & \cdots \\ \vdots & & \vdots & & \vdots \end{array} \right] $$ The columns with the pivot elements $a,b,c,\ldots$ are a maximal linearly independent set of columns of the reduced matrix, for reasons that you can show as an exercise.
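This procedure is easy to automate. Here is a minimal sketch in NumPy (the function name `pivot_columns` and the tolerance `tol` are my own choices, not a library API): it row-reduces a copy of the matrix and records which columns acquire a pivot. On the matrix from the question it picks out columns 1, 3, 4 (0-based indices 0, 2, 3), matching the rank of 3.

```python
import numpy as np

def pivot_columns(A, tol=1e-10):
    """Row-reduce a copy of A and return the indices of its pivot columns.
    Those columns of the original A form a maximal linearly independent set."""
    A = np.array(A, dtype=float)
    m, n = A.shape
    pivots = []
    row = 0
    for col in range(n):
        if row >= m:
            break
        # Partial pivoting: pick the largest entry in this column at or below `row`.
        p = row + np.argmax(np.abs(A[row:, col]))
        if abs(A[p, col]) < tol:
            continue                       # no pivot in this column
        A[[row, p]] = A[[p, row]]          # swap rows
        A[row] /= A[row, col]              # normalize the pivot row
        for r in range(m):
            if r != row:
                A[r] -= A[r, col] * A[row]  # eliminate the column elsewhere
        pivots.append(col)
        row += 1
    return pivots

A = [[0, 0, 4, 1, 1],
     [1, 2, 1, 1, 1],
     [2, 4, 3, 1, 2]]
print(pivot_columns(A))  # → [0, 2, 3]
```

For badly conditioned matrices you would want a relative tolerance (scaled by the magnitude of the entries) rather than the fixed `tol` used here.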