Prove gradient descent converges to a local minimum


I would like a proof that gradient descent converges to a local minimum, if one exists, for a differentiable function $f:\mathbb R \rightarrow \mathbb R$ with Lipschitz continuous derivative. Assume also that the function is bounded from below. One dimension is enough for the purposes of the question.

The gradient descent algorithm, with some initial condition $y_0$, is given by $$y_{n+1}=y_n-g(n)f'(y_n).$$

It is quite clear that $g(n)$ should be chosen so that $g(n)\rightarrow 0$ and $\sum g(n)$ diverges; otherwise the algorithm may jump around the minimum forever or get stuck before reaching it. A natural choice is $g(n)=\frac{1}{n}$. How can one prove the result, and is there a constructive proof?
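Not a proof, but a minimal numerical sketch of the setup, for concreteness. The test function $x^4 - 2x^2$, the starting point, and the scaling constant `c` are my own arbitrary choices (the function is bounded below and its derivative is Lipschitz on any bounded interval, which suffices here since the iterates stay bounded); the step size $g(n) = c/n$ satisfies both conditions above.

```python
def f(x):
    """Nonconvex test function, bounded below, with local minima at x = +1 and x = -1."""
    return x**4 - 2 * x**2

def fprime(x):
    """Derivative of f; Lipschitz continuous on any bounded interval."""
    return 4 * x**3 - 4 * x

def gradient_descent(y0, c=0.05, n_steps=20000):
    """Iterate y_{n+1} = y_n - g(n) f'(y_n) with g(n) = c / n.

    The constant c is only a scaling: c/n -> 0 and sum of c/n still
    diverges, so the step-size conditions from the question hold.
    """
    y = y0
    for n in range(1, n_steps + 1):
        y = y - (c / n) * fprime(y)
    return y

y = gradient_descent(0.5)
print(y, fprime(y))  # y settles near the local minimum at 1, with f'(y) small
```

Running this, the iterates drift toward the nearby local minimum at $x=1$ and $f'(y_n)$ shrinks, though slowly, since the $1/n$ steps decay quickly.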