Tricks for speeding up Conjugate Gradient on Normal Equations?


I have been using Krylov subspace methods for a long time in my research.

I know from general theory of Krylov subspace methods that for solving a problem:

$$\bf Ax = b$$

if $\bf A$ is symmetric positive definite, then Conjugate Gradient (CG) converges in at most $\dim({\bf x})$ iterations (in exact arithmetic).

For a particular problem I am solving at the moment, $\bf A$ lacks these properties, so I form the normal equations:

$${\bf A}^T{\bf Ax = A}^T {\bf b}$$

where $({\bf A}^T{\bf A})$ is guaranteed to be positive semidefinite (and positive definite whenever $\bf A$ has full column rank). For stability we can also add a Tikhonov regularization term $\lambda \bf I$:

$$({\bf A}^T{\bf A}+\lambda{\bf I}){\bf x} = {\bf A}^T {\bf b}$$
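To make the setup concrete, here is a minimal sketch of CG applied matrix-free to the regularized normal equations, i.e. applying ${\bf A}^T({\bf A}v) + \lambda v$ per iteration rather than forming ${\bf A}^T{\bf A}$ explicitly (the random rectangular test matrix below is purely hypothetical):

```python
import numpy as np

def cg_normal(A, b, lam=0.0, tol=1e-10, maxiter=None):
    """CG on the regularized normal equations
    (A^T A + lam I) x = A^T b, applying A^T A matrix-free."""
    m, n = A.shape
    maxiter = maxiter or 10 * n
    x = np.zeros(n)
    r = A.T @ b                    # residual at x = 0
    p = r.copy()
    rs = r @ r
    for _ in range(maxiter):
        Ap = A.T @ (A @ p) + lam * p   # never form A^T A
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# hypothetical rectangular test problem
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 100))
b = rng.standard_normal(200)
x = cg_normal(A, b)                # lam = 0: plain least squares
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
```

With $\lambda = 0$ this should agree with the least-squares solution; note that the effective condition number is $\kappa({\bf A}^T{\bf A}) = \kappa({\bf A})^2$, which is what slows CG down here.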

In practice, however, I only get a good solution after an iteration count of $\geq 2\dim({\bf x})$.

My questions are two-fold:

  1. Can we prove this behavior, and
  2. do there exist methods that avoid this doubling of the iteration count?

My own thinking so far is mostly restricted to speculation about a clever choice of the regularization parameter $\lambda$, or about finding a suitable preconditioner.
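For comparison, one known alternative is LSQR, which is mathematically equivalent to CG on the normal equations but applies ${\bf A}$ and ${\bf A}^T$ directly and is typically more numerically stable; SciPy's implementation also takes a `damp` parameter that corresponds to Tikhonov regularization with $\lambda = \texttt{damp}^2$. A minimal sketch (the test matrix is again hypothetical):

```python
import numpy as np
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(1)
A = rng.standard_normal((200, 100))
b = rng.standard_normal(200)

# LSQR solves min ||Ax - b||^2 + damp^2 ||x||^2
# without ever forming A^T A
result = lsqr(A, b, damp=0.0, atol=1e-12, btol=1e-12)
x, istop, itn = result[0], result[1], result[2]

x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
```

Whether this actually halves the iteration count in my setting is exactly the kind of thing I am asking about.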