Least Squares and Preconditioners


I've seen the preconditioned least squares objective $\arg\min_{x}\Vert P^{-1}Ax-P^{-1}b\Vert_2$, where $P^{-1}$ is positive definite. However, I'm not sure how to show that this is equivalent to the original problem $\arg\min_x \Vert Ax-b\Vert_2$. My attempt: if $\sigma$ is the smallest eigenvalue of $P^{-1}$, then \begin{align} \sigma\Vert Ax-b\Vert_2 \leq \Vert P^{-1}Ax-P^{-1}b\Vert_2 \leq \Vert P^{-1}\Vert_2 \Vert Ax-b\Vert_2, \end{align} and since scaling the objective by a positive constant does not change the minimizer, $\arg\min_x \sigma \Vert Ax-b\Vert_2=\arg\min_x \Vert P^{-1}\Vert_2 \Vert Ax-b\Vert_2=\arg\min_x \Vert Ax-b\Vert_2$, so I would conclude \begin{align} \arg\min_x \Vert Ax-b\Vert_2&=\arg\min_{x}\Vert P^{-1}Ax-P^{-1}b\Vert_2, \end{align} but I'm not sure that this argument is correct.
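As a sanity check on the last step (my own illustration, not part of the question): a sandwich bound $c\,h(x)\le g(x)\le C\,h(x)$ with constants $c, C>0$ does not by itself force $g$ and $h$ to share a minimizer. A minimal numerical sketch with two arbitrary scalar functions:

```python
import numpy as np

# h(x) = x^2 + 1 is minimized at x = 0;
# g(x) = (x - 0.5)^2 + 1 is minimized at x = 0.5.
# Their ratio g/h is bounded between positive constants,
# yet the two minimizers differ.
x = np.linspace(-10.0, 10.0, 100001)
h = x**2 + 1
g = (x - 0.5)**2 + 1

ratio = g / h
print("ratio bounds:", ratio.min(), ratio.max())          # both finite, positive
print("minimizers:", x[h.argmin()], x[g.argmin()])        # different points
```

So a two-sided bound alone cannot establish that the preconditioned and original problems have the same solution.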

1 Answer

This problem is not equivalent to the original one: you are now minimizing with respect to another inner product, the one given by $\langle x , y\rangle := x^T P^{-T}P^{-1}y$. Concretely, the normal equations become $A^T P^{-T}P^{-1}A x = A^T P^{-T}P^{-1} b$, which in general have a different solution than $A^T A x = A^T b$. Up to an orthogonal transformation this is a weighted least squares problem for some weights $w_1,\dots, w_n$, so it is not equivalent to the original (unweighted) minimization.
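The non-equivalence is easy to observe numerically. A sketch with NumPy (the matrices below are arbitrary; `P_inv` is just some symmetric positive definite matrix playing the role of $P^{-1}$):

```python
import numpy as np

rng = np.random.default_rng(0)

# Overdetermined system: A is 5x2, so b is generically not in range(A).
A = rng.standard_normal((5, 2))
b = rng.standard_normal(5)

# An arbitrary symmetric positive definite matrix standing in for P^{-1}.
M = rng.standard_normal((5, 5))
P_inv = M @ M.T + 5 * np.eye(5)

# Ordinary least squares: minimizes ||Ax - b||_2.
x_ols, *_ = np.linalg.lstsq(A, b, rcond=None)

# "Preconditioned" problem: minimizes ||P^{-1}(Ax - b)||_2,
# i.e. weighted least squares with weight matrix P^{-T} P^{-1}.
x_pre, *_ = np.linalg.lstsq(P_inv @ A, P_inv @ b, rcond=None)

# The two minimizers generally differ.
print("||x_ols - x_pre|| =", np.linalg.norm(x_ols - x_pre))
```

If $b$ lies in the range of $A$, both problems recover an exact solution of $Ax=b$ and the distinction disappears; it is the residual component of $b$ that gets reweighted differently.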