Gauss-Newton convergence for constant Hessian


If I use Gauss-Newton to solve a least square optimization problem and $\mathbf{J}^H\mathbf{J}$ is constant does it imply that I will reach the solution in one iteration?

Best answer

The answer is no. The matrix $J^HJ$ is not the Hessian of the objective; it is only the Gauss-Newton approximation to it. For $f(x)=\|r(x)\|^2$ the true Hessian is $2\left(J^HJ+\sum_i r_i\,\nabla^2 r_i\right)$, and Gauss-Newton drops the second term. A constant $J^HJ$ therefore does not make $f$ quadratic, which is what one-step convergence would require. Look at the following one-dimensional example: given a constant vector $y$, solve $\min f(x)$ where $$ f(x)=\left\|\left[\matrix{y_1\\y_2}\right]-\left[\matrix{\cos(x)\\ \sin(x)}\right]\right\|^2. $$ Here $$ J=\left[\matrix{\sin(x)\\ -\cos(x)}\right],\qquad J^HJ=1, $$ which is constant, but $f$ is not quadratic: $f''(x)=2\,(y_1\cos x+y_2\sin x)$ is not constant, so Gauss-Newton does not converge in one step.
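This is easy to check numerically. Below is a minimal sketch of the Gauss-Newton iteration for the example above, assuming (my choice, not from the question) the target $y=(\cos\theta,\sin\theta)$ with $\theta=0$ and the starting guess $x_0=1$. Since $J^HJ=1$, the Gauss-Newton step reduces to $x \leftarrow x - J^Tr$:

```python
import math

# Target y = (cos(theta), sin(theta)) with theta = 0, i.e. y = (1, 0).
# Residual r(x) = y - (cos x, sin x); Jacobian J = (sin x, -cos x)^T.
theta = 0.0
y = (math.cos(theta), math.sin(theta))

def gauss_newton_step(x):
    # J^T J = sin^2 x + cos^2 x = 1, so the step is just -J^T r.
    r = (y[0] - math.cos(x), y[1] - math.sin(x))
    JTr = math.sin(x) * r[0] - math.cos(x) * r[1]
    return x - JTr

x = 1.0  # starting guess
errors = []
for _ in range(5):
    x = gauss_newton_step(x)
    errors.append(abs(x - theta))

print(errors)
```

After one step the error is still of order $10^{-1}$, so the method plainly does not land on the minimizer in a single iteration; it does converge rapidly over the next few steps, consistent with Gauss-Newton's fast local convergence on small-residual problems.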