Is this illustration of Gauss Newton wrong?


In this illustration, the value at each iteration is the minimum of the 2nd derivative.

Animated Gauss-Newton

But the Wikipedia page says:

the advantage [of the Gauss–Newton algorithm] is that second derivatives, which can be challenging to compute, are not required.

Question:

Is the illustration wrong? If yes what algorithm is used?

On BEST ANSWER

The illustration shows the second-order approximation to the "original" curve, not the second derivative itself.

Relevant to this question is probably the difference between Newton-Raphson and Gauss-Newton explained here.

If the goal were indeed to find the minimum of the second-order approximation, then, knowing that it has only one extremum (the minimum), only the first derivative would be required, since that derivative has exactly one zero.
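As a small sketch of that point (with hypothetical coefficients): minimizing a convex quadratic $q(x) = ax^2 + bx + c$ never needs $q''$, because setting the single zero of $q'$ gives the minimum directly.

```python
# Minimizing a convex quadratic needs only its first derivative:
# q(x) = a*x**2 + b*x + c, with a > 0
# q'(x) = 2*a*x + b = 0  =>  x* = -b / (2*a)
a, b, c = 2.0, -4.0, 1.0  # hypothetical coefficients, a > 0

x_star = -b / (2 * a)  # unique zero of the first derivative
print(x_star)  # 1.0
```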

Also note that the y-axis is labeled $\chi^2$, so it is safe to assume that it is the minimum of the sum of squares which is approximated (at least indirectly, via the optimization of the parameter $a$).

The section "Derivation from Newton's method" of the Wikipedia article (which takes the perspective of fitting, and therefore of an overdetermined system) explains why you do not need the second derivative, starting at the line "The Gauss–Newton method is obtained by ignoring the second-order derivative terms".
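A minimal sketch of that derivation in code (hypothetical one-parameter model and data, NumPy assumed): for residuals $r_i(a)$, the full Newton Hessian of $\chi^2 = \sum_i r_i^2$ is $2\left(J^\top J + \sum_i r_i \nabla^2 r_i\right)$, and Gauss–Newton simply drops the $\sum_i r_i \nabla^2 r_i$ term, so only the Jacobian $J$ (first derivatives) is ever computed.

```python
import numpy as np

# Hypothetical least-squares fit: model y ~ exp(a * x), data chosen near a = 1.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.6, 7.5, 20.0])

def residuals(a):
    return np.exp(a * x) - y

def jacobian(a):
    # dr_i/da = x_i * exp(a * x_i)  -- first derivatives only
    return (x * np.exp(a * x)).reshape(-1, 1)

a = 0.5  # starting guess
for _ in range(20):
    r = residuals(a)
    J = jacobian(a)
    # Gauss-Newton step: solve (J^T J) delta = -J^T r.
    # This is Newton's method on chi^2 with the sum_i r_i * d2r_i/da2
    # term of the Hessian ignored, so no second derivatives appear.
    delta = np.linalg.solve(J.T @ J, -J.T @ r)
    a += delta.item()

print(a)  # converges close to 1
```

The quadratic shown in the animation is exactly the local model $\chi^2(a + \delta) \approx \chi^2(a) + 2 r^\top J \delta + \delta^\top J^\top J \delta$ whose minimizer is this step.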

Both the Wikipedia article and the illustration are correct; there is no conflicting information.