Can we always find a proper step size?

In convex optimization, if we know the gradient of a function $f(x)$, is it true that we can always determine a proper step size for the gradient descent method? By "proper" I mean only that the step size makes $f$ converge to its optimum.
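One standard approach to this (assuming $f$ is convex with a Lipschitz-continuous gradient) is a backtracking line search with the Armijo sufficient-decrease condition: start each iteration with a trial step size and shrink it geometrically until the function decreases enough. A minimal Python sketch; the function and parameter names here are my own, not from any particular library:

```python
import numpy as np

def gradient_descent_backtracking(f, grad, x0, alpha0=1.0, beta=0.5,
                                  c=1e-4, tol=1e-8, max_iter=1000):
    """Gradient descent with a backtracking (Armijo) line search.

    At each iterate x with gradient g, start from step size alpha0 and
    shrink it by the factor beta until the sufficient-decrease condition
        f(x - alpha * g) <= f(x) - c * alpha * ||g||^2
    holds. For convex f with Lipschitz gradient, this guarantees
    convergence of f(x_k) to the optimal value.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # gradient small -> near optimum
            break
        alpha = alpha0
        # Shrink alpha until sufficient decrease is achieved.
        while f(x - alpha * g) > f(x) - c * alpha * np.dot(g, g):
            alpha *= beta
        x = x - alpha * g
    return x

# Example: minimize the convex quadratic f(x) = x^T A x, A positive definite,
# whose unique minimizer is x = 0.
A = np.array([[3.0, 0.5],
              [0.5, 1.0]])
f = lambda x: x @ A @ x
grad = lambda x: 2.0 * A @ x
x_star = gradient_descent_backtracking(f, grad, x0=[5.0, -3.0])
```

The key point is that the search needs only function and gradient evaluations, no knowledge of the Lipschitz constant; the backtracking loop discovers an acceptable step size automatically at every iteration.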