Preconditioning the Damped Normal Equations - Reference


For people working with numerical approximations, it is very common to deal with a problem of the form $$\min_x \|Ax - b\|,$$ where $A$ is a matrix and $x,b$ are vectors. Suppose that $A$ is large and sparse. In this case it is also very common to use an iterative method with the aid of a preconditioner, i.e., a matrix $M$ such that $M^{-1}A$ has a small condition number. Then we solve the problem $$\min_x \|M^{-1}Ax - M^{-1}b\|.$$
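To make the effect concrete, here is a small NumPy sketch; the badly row-scaled matrix and the Jacobi choice $M = \operatorname{diag}(A)$ are my own illustrative assumptions, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
# Badly row-scaled matrix: a well-conditioned core times wildly varying row scales
scales = 10.0 ** rng.uniform(-4, 4, n)
A = scales[:, None] * (np.eye(n) + 0.01 * rng.standard_normal((n, n)))

# Jacobi (diagonal) preconditioner M = diag(A), trivially invertible
M_inv = np.diag(1.0 / np.diag(A))

print(np.linalg.cond(A))          # huge, dominated by the row scaling
print(np.linalg.cond(M_inv @ A))  # modest: the bad scaling is gone
```

Of course, real preconditioners (incomplete factorizations, multigrid, ...) target more than diagonal scaling, but the principle is the same.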

Since $M^{-1}A$ has a small condition number, the iterative solver will (we hope) solve the same problem in fewer iterations than before. This is the purpose of preconditioning.

It is also common to regularize the problem. I'll consider only Tikhonov regularization, which changes the original problem to the following: $$\min_x \|Ax - b\|^2 + \lambda \|x\|^2,$$ where $\lambda \geq 0$ is the damping parameter.

The solution of this problem is given by the normal equations $$(A^TA + \lambda I)x = A^Tb.$$
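As a sanity check, the damped normal equations give the same solution as the equivalent augmented least-squares problem $\min_x \left\| \begin{pmatrix} A \\ \sqrt{\lambda}\, I \end{pmatrix} x - \begin{pmatrix} b \\ 0 \end{pmatrix} \right\|$. A quick NumPy sketch (the sizes and the value of $\lambda$ are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 200, 30
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
lam = 0.1  # damping parameter, chosen arbitrarily for the demo

# Normal-equations solve: (A^T A + lam*I) x = A^T b
x_ne = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Equivalent augmented least-squares: min || [A; sqrt(lam)*I] x - [b; 0] ||
A_aug = np.vstack([A, np.sqrt(lam) * np.eye(n)])
b_aug = np.concatenate([b, np.zeros(n)])
x_ls, *_ = np.linalg.lstsq(A_aug, b_aug, rcond=None)

print(np.linalg.norm(x_ne - x_ls))  # essentially zero: same minimizer
```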

The advantage of this formulation is that $A^TA + \lambda I$ is an SPD matrix. When $A$ is ill-conditioned we may want to solve the normal equations instead of the original minimization problem. Also, if $A$ is very tall, then $A^TA$ is square with much smaller dimension, which can also speed up the computations. The bad news is that forming $A^TA$ squares the condition number of $A$ (the damping term $\lambda I$ only mitigates this when $\lambda$ is comparable to $\sigma_{\min}(A)^2$), so the solver will probably converge very slowly, and we gain nothing.
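The squaring is easy to see from the singular values: $\operatorname{cond}(A^TA) = (\sigma_{\max}/\sigma_{\min})^2$, and the eigenvalues of $A^TA + \lambda I$ are $\sigma_i^2 + \lambda$. A small sketch, with an arbitrarily chosen ill-conditioned test matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 100, 20
# Ill-conditioned tall matrix: columns scaled from 1 down to 1e-3
A = rng.standard_normal((m, n)) * np.logspace(0, -3, n)

s = np.linalg.svd(A, compute_uv=False)
cond_A = s[0] / s[-1]
cond_AtA = np.linalg.cond(A.T @ A)
print(cond_A, cond_AtA)  # the second is cond_A**2

# Eigenvalues of A^T A + lam*I are sigma_i^2 + lam, so damping caps the
# condition number -- but only noticeably once lam is near sigma_min^2
lam = 1e-3
cond_damped = (s[0] ** 2 + lam) / (s[-1] ** 2 + lam)
print(cond_damped)
```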

The solution to this is to precondition $A^TA + \lambda I$ with some matrix $M$ such that $M^{-1} (A^TA + \lambda I)$ has a small condition number. Now we are joining two common worlds: preconditioning and regularization. Since both arise so often in practice, I wonder if someone has already tried to precondition the damped normal equations formulation. I tried to find an article about it, but without success. I feel there must be something 'out there'. Probably some of you know something.
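I don't have a reference either, but as a baseline experiment: even the simplest choice, a Jacobi preconditioner $M = \operatorname{diag}(A^TA + \lambda I)$ (whose diagonal is just the squared column norms of $A$ plus $\lambda$), can already help CG on the damped normal equations. Here is a matrix-free SciPy sketch; the sparse test matrix, its column scaling, and $\lambda$ are all my own illustrative assumptions:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

rng = np.random.default_rng(3)
m, n = 2000, 500
# Sparse, badly column-scaled A, so a diagonal preconditioner has work to do
A = sp.random(m, n, density=0.01, random_state=3, format="csr")
A = A @ sp.diags(np.logspace(0, -2, n))
b = rng.standard_normal(m)
lam = 1e-3  # damping parameter, arbitrary for the demo

# Apply B = A^T A + lam*I matrix-free: A^T A is never formed explicitly
B = spla.LinearOperator((n, n), matvec=lambda x: A.T @ (A @ x) + lam * x,
                        dtype=np.float64)
rhs = A.T @ b

# Jacobi preconditioner: diag(A^T A + lam*I) = squared column norms + lam
d = np.asarray(A.multiply(A).sum(axis=0)).ravel() + lam
M = spla.LinearOperator((n, n), matvec=lambda x: x / d, dtype=np.float64)

# Count CG iterations with and without the preconditioner
iters = {"plain": 0, "jacobi": 0}
x_plain, _ = spla.cg(B, rhs, maxiter=2000,
                     callback=lambda xk: iters.__setitem__("plain", iters["plain"] + 1))
x_jac, info = spla.cg(B, rhs, maxiter=2000, M=M,
                      callback=lambda xk: iters.__setitem__("jacobi", iters["jacobi"] + 1))
print(iters)  # preconditioned CG needs noticeably fewer iterations here
```

This only fixes the column scaling, so I'd still be interested in references on preconditioners built specifically for $A^TA + \lambda I$ (e.g. incomplete factorizations that exploit the damping).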

Thank you for your attention.