How would you introduce the Levenberg-Marquardt algorithm:
- To someone who understands the concepts of minimisation and derivatives.
- Using intuition instead of equations, if possible.
For instance a way to explain Newton, Gauss-Newton or Gradient-descent algorithms is to use such illustrations:
Animated illustration of the Newton algorithm
Next iteration where the first derivative = 0.
Animated illustration of the Gauss-Newton algorithm
Next iteration where the second derivative is minimal.
[Animated illustration of the Gradient descent algorithm](i.stack.imgur.com/X93yc.gif)
Step size increases if direction stays the same and is dropped when the gradient has changed its direction
Is there any equivalent illustration for the Levenberg-Marquardt algorithm?
An intuitive animation of gradient descent (GD) is pretty easy to come up with. For Gauss-Newton (GN) in higher dimensions, I think a better intuition is this: in gradient descent, each step moves in the direction opposite the gradient, by a step of a given size. In GN (assuming you are near the solution), you also step against the gradient, but you distort the step to match the curvature of the surface*.
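To make the "distorted step" concrete, here is a minimal sketch on a made-up one-parameter curve fit (the model, data, and the 0.1 GD step size are illustrative choices, not part of the algorithm): the GD step is a fixed fraction of the negative gradient, while the GN step is the same negative gradient reshaped by the curvature proxy `J^T J`.

```python
import numpy as np

# Toy least-squares problem (illustrative): fit y = exp(a*x), true a = 0.8.
x = np.linspace(0.0, 1.0, 20)
y = np.exp(0.8 * x)

def residuals(a):
    return np.exp(a * x) - y

def jacobian(a):
    # One row per data point: d(residual_i)/da
    return (x * np.exp(a * x)).reshape(-1, 1)

a = 0.0                                      # starting guess
r, J = residuals(a), jacobian(a)
grad = (J.T @ r).ravel()                     # gradient of 0.5 * ||r||^2

gd_step = -0.1 * grad                        # GD: fixed fraction of -gradient
gn_step = -np.linalg.solve(J.T @ J, grad)    # GN: -gradient reshaped by J^T J

# Iterating the GN step converges quickly on this well-behaved problem:
for _ in range(10):
    r, J = residuals(a), jacobian(a)
    a -= np.linalg.solve(J.T @ J, J.T @ r).item()
```

In one dimension the distortion is just a rescaling; in higher dimensions `J^T J` also rotates the step toward the valley floor of the surface.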
The Levenberg-Marquardt algorithm, as @Nir Regev said, interpolates between GN (when we are near a solution and want to refine it) and GD (when we are far from an optimum and need to take large steps across the surface).
*: This can be seen as a consequence of the relationship between the Hessian and the covariance matrix.
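The interpolation can be sketched with a damping parameter: large damping makes the step GD-like, small damping makes it GN-like, and the damping is adjusted depending on whether the last step actually reduced the error. This is a minimal sketch on made-up data (the initial `lam`, the factor of 3, and Marquardt's diagonal scaling are common conventions, not fixed parts of the method):

```python
import numpy as np

# Toy curve fit (illustrative): y = a * exp(b * x), true (a, b) = (2.0, -1.3).
x = np.linspace(0.0, 2.0, 30)
y_obs = 2.0 * np.exp(-1.3 * x)

def residuals(p):
    a, b = p
    return a * np.exp(b * x) - y_obs

def jacobian(p):
    a, b = p
    e = np.exp(b * x)
    return np.column_stack([e, a * x * e])   # d r/d a, d r/d b

p = np.array([1.0, 0.0])   # poor starting guess
lam = 1e-2                 # damping: large -> GD-like, small -> GN-like
for _ in range(100):
    r = residuals(p)
    J = jacobian(p)
    A = J.T @ J
    # Damped normal equations: lam blends GN (lam -> 0) with scaled GD.
    step = np.linalg.solve(A + lam * np.diag(np.diag(A)), -J.T @ r)
    if np.sum(residuals(p + step) ** 2) < np.sum(r ** 2):
        p, lam = p + step, lam / 3           # step helped: trust GN more
    else:
        lam *= 3                             # step failed: retreat toward GD

print(p)  # should approach [2.0, -1.3]
```

The accept/reject rule is what keeps the method safe far from the optimum: a bad GN-like step is simply discarded and retried with more damping, i.e. a shorter, more gradient-aligned step.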