How to calculate convergence rate of gradient descent


I am researching gradient descent. I am looking at the convex case with Lipschitz-continuous gradients, using Nesterov's "Lectures on Convex Optimization". His result for the convex case is

$ f(x_k) - f^* \leq \frac{2L \|x_0-x^*\|^2}{k+4} $

and for the strongly convex case

$ f(x_k) - f^* \leq \frac{L}{2} \left(\frac{L-\mu}{L+\mu}\right)^{2k} \|x_0-x^*\|^2 $,

where $L$ is the Lipschitz constant of the gradient and $\mu$ is the strong convexity parameter.

My question is: how do I read off the convergence rate from these results? For the first bound I found the rate stated as $\mathcal{O}(\frac{1}{k})$ in another paper. How would I derive the rate for the strongly convex case?

Everything I found about convergence rates (on Wikipedia etc.) gives estimates in terms of the differences of the $x$-values, not the actual function values.
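To make the question concrete, here is a small numerical sketch of the strongly convex setting. The quadratic $f(x) = \frac{1}{2}(\mu x_1^2 + L x_2^2)$ and the values $\mu = 1$, $L = 10$ are my own illustrative choices (not from the lectures); the step size $\frac{2}{L+\mu}$ is the fixed step for which Nesterov states the strongly convex bound. The check confirms that $f(x_k) - f^*$ indeed stays below $\frac{L}{2} q^{2k} \|x_0 - x^*\|^2$ with $q = \frac{L-\mu}{L+\mu}$:

```python
# Sketch: check Nesterov's strongly convex bound on a toy quadratic.
# mu, L and the quadratic below are illustrative assumptions, not from the book.
mu, L = 1.0, 10.0            # strong convexity / Lipschitz constants
h = 2.0 / (L + mu)           # fixed step size used in the strongly convex result
q = (L - mu) / (L + mu)      # contraction factor appearing in the bound

def f(x):
    # f(x) = 0.5*(mu*x1^2 + L*x2^2), minimized at x* = (0, 0) with f* = 0
    return 0.5 * (mu * x[0]**2 + L * x[1]**2)

x = [1.0, 1.0]               # x0, so ||x0 - x*||^2 = 2
for k in range(1, 31):
    # gradient step: grad f(x) = (mu*x1, L*x2)
    x = [x[0] - h * mu * x[0], x[1] - h * L * x[1]]
    bound = 0.5 * L * q**(2 * k) * 2.0   # (L/2) * q^(2k) * ||x0 - x*||^2
    assert f(x) <= bound + 1e-15, (k, f(x), bound)

print("bound holds for 30 iterations; f(x_30) =", f(x))
```

On this example the error shrinks by the constant factor $q^2$ every iteration, i.e. geometrically, whereas the convex bound only shrinks like $1/k$.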

Thanks in advance.