The description of gradient descent in Wikipedia says: $$x_{n+1} = x_n - \gamma_n\nabla F(x_n)$$ for $n = 0,1,2,...$
Suppose that $x_n$ converges to $x$.
Then, is it always true that $\nabla F(x) = 0$?
Can $x$ be a saddle point of $F$?
If $\nabla F$ is continuous and the step sizes are bounded away from zero, $\gamma_n \ge \gamma > 0$, then $x_n \to x^*$ implies $\nabla F(x^*) = 0$. Indeed, $x_n \to x^*$ gives $x_{n+1} - x_n \to 0$, so $$\gamma_n \nabla F(x_n) = x_n - x_{n+1} \to 0,$$ and since $\gamma_n \ge \gamma > 0$ this forces $\nabla F(x_n) \to 0$; by continuity, $\nabla F(x^*) = 0$. (Without a lower bound on the step sizes the conclusion can fail: if $\gamma_n \to 0$ fast enough, the iterates can converge to a point where the gradient is nonzero.)
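A minimal numeric sketch of the first claim (a hypothetical example, not part of the original question): gradient descent on the convex function $F(x) = x^2$ with a fixed step size. The iterates converge to $x^* = 0$, where the gradient vanishes.

```python
def grad_F(x):
    return 2.0 * x  # gradient of F(x) = x^2

x = 1.0
gamma = 0.1  # fixed step size, bounded away from zero
for _ in range(200):
    x = x - gamma * grad_F(x)

# x_n = (1 - 2*gamma)^n = 0.8^n -> 0, and grad_F(x_n) -> 0 as well
print(x, grad_F(x))
```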
If $F$ is convex, then $x^*$ is guaranteed to be a global minimizer of $F$. In general, however, the limit only satisfies the first-order condition $\nabla F(x^*) = 0$, so $x^*$ may well be a saddle point of $F$.
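A concrete sketch of convergence to a saddle point (a hypothetical example): $F(x, y) = x^2 - y^2$ has a saddle at the origin. If the iteration starts on the $x$-axis ($y_0 = 0$), the $y$-coordinate stays zero forever, and gradient descent converges to the saddle $(0, 0)$.

```python
def grad_F(x, y):
    return (2.0 * x, -2.0 * y)  # gradient of F(x, y) = x^2 - y^2

x, y = 1.0, 0.0  # start exactly on the x-axis
gamma = 0.1
for _ in range(200):
    gx, gy = grad_F(x, y)
    x, y = x - gamma * gx, y - gamma * gy

# The iterates approach (0, 0), which is a saddle point of F,
# not a local minimum.
print(x, y)
```

Any perturbation off the axis ($y_0 \ne 0$) would instead make the $y$-coordinate grow, so such saddle limits are typically unstable, but they can occur.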