Strong Convexity Inequality with Minimizer


I was reading "Primal-dual subgradient methods for convex problems" by Nesterov, and in the appendix, he proves that if $d(x)$ is $\sigma$-strongly convex, then it has a minimizer $x'$, and it satisfies $$d(x)\geq d(x')+\frac{\sigma}{2}\| x-x' \|^2.$$

To prove this inequality, he uses the definition of strong convexity that for all $\alpha \in [0,1]$, we have $$ \begin{align} &\alpha d(x)+(1-\alpha)d(x') \\ &\geq d(\alpha x+(1-\alpha)x')+\frac{1}{2}\sigma\alpha(1-\alpha)\| x-x' \|^2 \\ &\geq d(x')+\frac{1}{2}\sigma\alpha(1-\alpha)\| x-x' \|^2. \end{align}$$

However, he then proceeds by claiming that

$$d(x)\geq d(x')+\frac{1}{2}\sigma(1-\alpha)\| x-x' \|^2,$$

and letting $\alpha \to 0$ yields the result. My question is: why does this claimed inequality follow, and how do we obtain the desired inequality from here?

Answer:
Comparing the first and last lines of the inequality chain you stated gives

$$\alpha d(x)+(1-\alpha)d(x') \geq d(x')+\frac{1}{2}\sigma\alpha(1-\alpha)\| x-x' \|^2.$$

Subtract $d(x')$ from both sides to get

$$\alpha \left(d(x)-d(x')\right) \geq \frac{1}{2}\sigma\alpha(1-\alpha)\| x-x' \|^2.$$

For $\alpha \in (0,1]$, divide both sides by $\alpha$ to get

$$d(x)-d(x') \geq \frac{1}{2}\sigma(1-\alpha)\| x-x' \|^2,$$

This holds for every $\alpha \in (0,1]$, and the left-hand side does not depend on $\alpha$. Letting $\alpha \to 0^+$, the factor $(1-\alpha)$ tends to $1$, so $$d(x)-d(x') \geq \frac{\sigma}{2}\| x-x' \|^2,$$ which is the desired inequality.
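As a quick numerical sanity check (not part of Nesterov's argument), one can verify the inequality $d(x)\geq d(x')+\frac{\sigma}{2}\|x-x'\|^2$ for a concrete strongly convex function. The choice $d(x)=\|x\|^2+\|x\|^4$ below is an illustrative assumption: it is $\sigma$-strongly convex with $\sigma=2$ and has minimizer $x'=0$.

```python
import numpy as np

# Illustrative example: d(x) = ||x||^2 + ||x||^4 is 2-strongly convex
# (the ||x||^4 term is convex, so it only adds curvature) with minimizer x* = 0.
sigma = 2.0
d = lambda x: np.dot(x, x) + np.dot(x, x) ** 2
x_star = np.zeros(3)

rng = np.random.default_rng(0)
for _ in range(1000):
    x = rng.normal(size=3)
    lhs = d(x)
    # Right-hand side of the strong-convexity bound at the minimizer.
    rhs = d(x_star) + 0.5 * sigma * np.dot(x - x_star, x - x_star)
    assert lhs >= rhs - 1e-12
print("inequality verified on 1000 random points")
```

Note that with $d(x)=\|x\|^2$ alone the bound would hold with equality at every point, which is why a strictly "more convex" term is added to make the gap visible.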