I am trying to understand whether the ordering of columns matters in QR decomposition.
In general it seems that column ordering shouldn't matter. I would guess that for the SVD, or any matrix factorization, the way columns and rows are ordered has no effect; i.e. we can permute the columns and rows entirely and it won't matter to the algorithms.
Am I correct? Does the ordering matter for the final result, or for the intermediate or approximate solutions? Any relevant literature?
Cheers!
If you're asking whether it matters which column you start with, i.e. the order in which you process $a_{1}, a_{2}, \cdots, a_{n}$: the final error will likely be the same, but the intermediate error may differ. Trefethen and Bau is a good reference. It is also a function of which Gram-Schmidt variant (classical or modified) you're using.
If you're producing $A = \hat{Q}\hat{R}$ and you take
$$ \| A - \hat{Q}\hat{R}\| $$
then that residual may differ along the way. Either ordering still produces an orthonormal set of vectors, but the intermediate quantities differ because the vectors being processed at each step are not the same.
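A minimal sketch of the point above, assuming modified Gram-Schmidt; the `mgs_qr` routine, the random test matrix, and the two column orderings are my own illustration, not from the post. Each ordering factors the correspondingly permuted matrix, and the final residual $\| A - \hat{Q}\hat{R}\|$ is small in both cases even though the intermediate columns differ:

```python
import numpy as np

def mgs_qr(A):
    """QR via modified Gram-Schmidt: returns Q with orthonormal
    columns and upper-triangular R such that A = Q R."""
    A = A.astype(float)
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        # Subtract components along previously computed q_i, one at a
        # time from the running vector v (this is the "modified" variant).
        for i in range(j):
            R[i, j] = Q[:, i] @ v
            v -= R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))

# Two arbitrary column orderings (illustrative, not canonical choices).
for perm in ([0, 1, 2, 3], [3, 1, 0, 2]):
    Qp, Rp = mgs_qr(A[:, perm])
    # This is a factorization of the permuted matrix A[:, perm];
    # the residual is measured against that permuted matrix.
    print(perm, np.linalg.norm(A[:, perm] - Qp @ Rp))
```

Both printed residuals are at the level of rounding error, consistent with the claim that the end error is essentially the same; the step-by-step partial sums, however, depend on the ordering.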