Let $A\in\mathbb{R}^{m\times n}$. The (full) QR decomposition $$A=QR$$ produces an orthogonal matrix $Q\in\mathbb{R}^{m\times m}$ and an upper triangular matrix $R\in\mathbb{R}^{m\times n}$.
In the case $m>n$, the economy QR is obtained by cutting the last $m-n$ columns of the matrix $Q$; since $R$ is upper triangular, its last $m-n$ rows are all $0$, so they are removed as well. This produces the economy QR $$A=Q_{1}R_{1}$$ where $Q_{1}\in\mathbb{R}^{m\times n}$ and $R_{1}\in\mathbb{R}^{n\times n}$.
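The shapes can be checked numerically. A small sketch using NumPy's `np.linalg.qr`, where `mode='complete'` gives the full factorization and `mode='reduced'` the economy one:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 6, 3
A = rng.standard_normal((m, n))

# Full QR: Q is m x m, R is m x n with zero rows below row n.
Q, R = np.linalg.qr(A, mode='complete')

# Economy (thin) QR: Q1 is m x n, R1 is n x n.
Q1, R1 = np.linalg.qr(A, mode='reduced')

print(Q.shape, R.shape)    # (6, 6) (6, 3)
print(Q1.shape, R1.shape)  # (6, 3) (3, 3)

# The last m-n rows of R are (numerically) zero,
# so both products reconstruct A.
print(np.allclose(R[n:], 0))        # True
print(np.allclose(Q @ R, Q1 @ R1))  # True
```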
My question is: by removing the nonzero last $m-n$ columns of $Q$ in this procedure, aren't we removing some valuable data? Suppose $A$ is of full rank and I am working on a regression problem where the columns of $A$ hold the features. If $Q$ holds valuable information about the features, should I prioritize some of them by multiplying $A$ on the right by a permutation matrix $P$, producing $AP=QR=Q_{1}R_{1}$? This would (correct me if I am wrong) reorder the columns of $A$, so that the prioritized features come first in the factorization.
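For what it's worth, choosing a column permutation $P$ is exactly what rank-revealing (column-pivoted) QR does. A sketch using SciPy's `scipy.linalg.qr` with `pivoting=True`, which factors $AP = QR$ with pivots chosen greedily so that the diagonal of $R$ is non-increasing in magnitude:

```python
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 3))

# Column-pivoted economy QR: A[:, piv] = Q @ R, where piv is an index
# array representing the permutation P.
Q, R, piv = qr(A, mode='economic', pivoting=True)

# The permuted A is reconstructed exactly (up to rounding).
assert np.allclose(A[:, piv], Q @ R)

# The pivoting strategy makes |r_11| >= |r_22| >= ... on the diagonal,
# so the "most important" columns are processed first.
d = np.abs(np.diag(R))
assert np.all(d[:-1] >= d[1:] - 1e-12)
```

The pivot order in `piv` can itself be read as a feature ranking, which is the basis of QR-based subset-selection heuristics.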
Here is a direct algebraic proof that the discarded columns contribute nothing to $A$:
'Full' QR factorization
$A = Q'R'$
where $A \in \mathbb R^{m\times n}$ is tall and skinny ($m > n$) and injective (full rank),
$Q'$ is orthogonal (and $m\times m$) while $R'$ is upper triangular (and $m\times n$),
implies 'thin' QR factorization
$A=QR$
$Q':=\bigg[\begin{array}{c|c|c|c|c|c} \mathbf q_1 &\cdots & \mathbf q_{n} & \mathbf q_{n+1} &\cdots & \mathbf q_{m}\end{array}\bigg]$
$Q:=\bigg[\begin{array}{c|c|c} \mathbf q_1 & \cdots & \mathbf q_{n}\end{array}\bigg]$
$R'=\begin{bmatrix}R \\ \mathbf 0\end{bmatrix}=\begin{bmatrix}\mathbf r_1^T\\ \vdots \\ \mathbf r_n^T \\ \mathbf 0^T \\ \vdots \\ \mathbf 0^T\end{bmatrix}$
where we know everything below $R$ is zero because $R'$ is 'upper triangular' (though tall and skinny)
Finally, using the 'outer product' interpretation of matrix multiplication this gives
$A=Q'R' = \big(\mathbf q_1\mathbf r_1^T + \cdots +\mathbf q_n\mathbf r_n^T\big) + \big(\mathbf q_{n+1}\mathbf 0^T+ \cdots +\mathbf q_{m}\mathbf 0^T\big)=\big(\mathbf q_1\mathbf r_1^T + \cdots +\mathbf q_n\mathbf r_n^T\big) = QR$
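The outer-product cancellation above can also be verified numerically, a small NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 5, 2
A = rng.standard_normal((m, n))

# Full QR: Q is m x m, R is m x n.
Q, R = np.linalg.qr(A, mode='complete')

# First n outer products q_k r_k^T reconstruct A ...
head = sum(np.outer(Q[:, k], R[k, :]) for k in range(n))
# ... while the last m-n terms vanish, since rows n..m-1 of R are zero.
tail = sum(np.outer(Q[:, k], R[k, :]) for k in range(n, m))

print(np.allclose(head, A))  # True
print(np.allclose(tail, 0))  # True
```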