Least squares regression (LSR)

I have the following least-squares regression problem:

$\underset{rank(X)\leq k}{\arg\min} \lVert DX - L \rVert_F^2$

Let's suppose I compute a QR decomposition of $D$ as $D = Q_DR_D$ and solve the equivalent problem:

$\underset{rank(X)\leq k}{\arg\min} \lVert (Q_DR_D)X - L\rVert_F^2$

Now, the solution to this problem is:

$((Q_DR_D)^\dagger L)_k = (R_D^\dagger Q_D^TL)_k$

where $(M)_k$ denotes the best rank-$k$ approximation of a matrix $M$. The equality follows from the properties of the Moore-Penrose pseudoinverse ($\dagger$) and from the fact that $Q_D^\dagger = Q_D^T$, since $Q_D$ has orthonormal columns.
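As a sanity check, the identity $(Q_DR_D)^\dagger L = R_D^\dagger Q_D^T L$ can be verified numerically. Here is a minimal NumPy sketch (the matrix sizes and random data are my own illustrative choices, not from any paper):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, p = 8, 5, 4
D = rng.standard_normal((m, n))  # a generic full-column-rank D
L = rng.standard_normal((m, p))

# Thin QR: Q has orthonormal columns (m x n), R is upper triangular (n x n).
Q, R = np.linalg.qr(D)

lhs = np.linalg.pinv(Q @ R) @ L        # (Q_D R_D)^+ L
rhs = np.linalg.pinv(R) @ Q.T @ L      # R_D^+ Q_D^T L

print(np.allclose(lhs, rhs))           # the two expressions agree
```

This confirms the pseudoinverse step, i.e. $(Q_DR_D)^\dagger = R_D^\dagger Q_D^T$; the rank-$k$ truncation in the next equality is the part I am unsure about.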

However, while reading a paper, I stumbled upon this further equality:

$(R_D^\dagger Q_D^TL)_k = R_D^\dagger(Q_D^TL)_k$

which I cannot justify. Can anybody please help me understand it?