What is the best known algorithm in terms of convergence rate for unconstrained convex optimization and under what assumptions?
$\min_{x} f(x)$
where $f(x)$ is a given twice differentiable convex function
Simply specifying that a function is twice differentiable is not enough to guarantee any particular convergence rate.
The best theoretical treatment of second-order methods---that is, methods that exploit both first- and second-derivative information---is probably by Yurii Nesterov and Arkadii Nemirovskii. Their work requires an assumption of self-concordance, which in the scalar case is $$|f'''(x)|\leq \alpha (f''(x))^{3/2} \quad \forall x$$ for some fixed constant $\alpha>0$.
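For self-concordant functions, the standard method is damped Newton, where the step is scaled by the Newton decrement. A minimal scalar sketch (the test function $f(x) = x - \log x$, its derivatives, and the helper name are my own illustration, not from the references above):

```python
import math

def damped_newton(fp, fpp, x0, tol=1e-10, max_iter=100):
    """Damped Newton's method for a scalar self-concordant function.

    fp, fpp: first and second derivatives of f.
    The 1/(1 + lam) damping, with lam the Newton decrement,
    is what gives the global guarantees in Nesterov's analysis.
    """
    x = x0
    for _ in range(max_iter):
        g, h = fp(x), fpp(x)
        lam = abs(g) / math.sqrt(h)      # Newton decrement
        if lam < tol:
            break
        x -= g / (h * (1.0 + lam))       # damped Newton step
    return x

# Example: f(x) = x - log(x) is self-concordant on x > 0,
# with minimizer x* = 1.
x_star = damped_newton(lambda x: 1 - 1/x, lambda x: 1/x**2, x0=5.0)
```

Once the decrement drops below a fixed threshold, the damping factor is close to 1 and the iteration behaves like pure Newton, giving quadratic local convergence.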
For methods that exploit only first-derivative information, a good resource is... Nesterov once again. There too, you need additional information about $f$, such as Lipschitz continuity of the gradient. Again, in the scalar case, this looks like $$|f'(x)-f'(y)| \leq L |x-y| \quad \forall x,y$$ for some constant $L>0$. If you also have strong convexity, you can get even better performance bounds.
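Under that Lipschitz-gradient assumption, Nesterov's accelerated gradient method attains the optimal $O(1/k^2)$ rate for first-order methods, versus $O(1/k)$ for plain gradient descent. A minimal sketch (the momentum schedule below is the classic $t_{k+1} = (1+\sqrt{1+4t_k^2})/2$ variant; the quadratic test function is my own example):

```python
import math

def nesterov_agd(grad, L, x0, iters=200):
    """Nesterov's accelerated gradient method for an L-smooth convex f.

    grad: gradient of f; L: Lipschitz constant of grad.
    Takes a gradient step from an extrapolated point y, then updates
    the momentum coefficient t.
    """
    x, y, t = x0, x0, 1.0
    for _ in range(iters):
        x_new = y - grad(y) / L                      # gradient step at y
        t_new = (1 + math.sqrt(1 + 4 * t * t)) / 2   # momentum schedule
        y = x_new + ((t - 1) / t_new) * (x_new - x)  # extrapolation
        x, t = x_new, t_new
    return x

# Example: f(x) = (x - 3)^2, with gradient 2(x - 3) and L = 2.
x_min = nesterov_agd(lambda x: 2 * (x - 3), L=2.0, x0=0.0)
```

The only change relative to plain gradient descent is the extrapolation step through $y$; with strong convexity, a constant momentum coefficient gives a linear rate instead.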
Your best bet to learn more here is Google. The search terms I'd use for the first case are "self-concordant Newton's method", and for the second, "accelerated first-order methods." The general point here is this: to answer the question of complexity, you need two things: the set of calculations you are allowed to perform on the function (e.g., how many derivatives), and a regularity condition of some sort that ensures that those function values don't change too fast.