If I use Gauss-Newton to solve a least-squares optimization problem and $\mathbf{J}^H\mathbf{J}$ is constant, does that imply I will reach the solution in one iteration?
2026-04-24
Gauss-Newton convergence for constant Hessian
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
The answer is no. The reason is that $J^HJ$ is not the Hessian of the objective, but only the Gauss-Newton approximation to it: the true Hessian also contains a second-order term involving the residuals, which need not be constant even when $J^HJ$ is. Look at the following one-dimensional example: given a constant vector $y$, solve $\min_x f(x)$ where $$ f(x)=\left\|\left[\matrix{y_1\\y_2}\right]-\left[\matrix{\cos(x)\\ \sin(x)}\right]\right\|^2. $$ Here $$ J=\left[\matrix{\sin(x)\\ -\cos(x)}\right],\qquad J^HJ=1, $$ but $f(x)=\|y\|^2+1-2\left(y_1\cos(x)+y_2\sin(x)\right)$ is not a quadratic function of $x$ ($f''$ is not constant), so Gauss-Newton does not converge in one step.
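To make this concrete, here is a small numerical sketch of the example above (the target angle $\theta = 1$ and the starting point $x_0 = 0$ are my choices, not from the original). Since $J^HJ=1$, the Gauss-Newton update reduces to $x \leftarrow x - J^T r$, yet the first iterate does not land on the minimizer:

```python
import math

def gauss_newton_step(x, y1, y2):
    # Residual r(x) = [y1 - cos(x), y2 - sin(x)], Jacobian J = [sin(x), -cos(x)]^T.
    # J^T J = sin^2(x) + cos^2(x) = 1 is constant, so the GN step is just x - J^T r.
    JTr = math.sin(x) * (y1 - math.cos(x)) - math.cos(x) * (y2 - math.sin(x))
    return x - JTr

theta = 1.0                              # minimizer: y lies exactly on the unit circle
y1, y2 = math.cos(theta), math.sin(theta)

x = 0.0
for k in range(1, 7):
    x = gauss_newton_step(x, y1, y2)
    print(f"iteration {k}: x = {x:.10f}")
```

With $y=(\cos\theta,\sin\theta)$ the update simplifies to $x \leftarrow x - \sin(x-\theta)$: the first iterate is $\sin(1)\approx 0.84$, not $1$, so several iterations are needed even though $J^HJ$ is constant (a true Newton step on a quadratic would finish in one).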