I am reading the paper of Prof. Yurii Nesterov:
Primal-dual subgradient methods for convex problems
The following part confuses me:
- ${\color{red}{E}}\ $ ${\color{red}{\text{is a finite-dimensional real vector space.}}}$
- ${\color{red}{d(x)}}\ $ ${\color{red}{\text{is strongly convex with a convexity parameter}\ \ \sigma \ \ \text{on}\ \ Q}}$
- ${\color{red}{d(x_0)=0}}\ $, where $x_0$ is the minimizer of $d$ on $Q$.
My question concerns the last sentence. The first function behaves like a dual norm, and since the dual norm is a norm, it is nonnegative.
How can I show that the second function is nonnegative, and that it is strongly convex with convexity parameter $\sigma \beta$?
Note: The second question comes from another of his papers; here is the relevant part:
Edit: @copper.hat pointed out there's a much simpler solution. Just note that \begin{align*} V_\beta(s) = \max_x \, \langle s, x - x_0 \rangle - \beta d(x) \geq \langle s, x_0 - x_0 \rangle - \beta d(x_0) = 0. \end{align*}
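To see copper.hat's bound in action, here is a small numeric sketch (my own illustration, not from the paper): take $d(x) = \tfrac{1}{2}\|x - x_0\|^2$, which is strongly convex with $\sigma = 1$ and satisfies $d(x_0) = 0$. For this choice the maximum defining $V_\beta$ is attained at $x^* = x_0 + s/\beta$, giving the closed form $V_\beta(s) = \|s\|^2 / (2\beta) \geq 0$; plugging in $x = x_0$ reproduces the lower bound $0$ from the argument above.

```python
import numpy as np

# Sketch: d(x) = 0.5 * ||x - x0||^2, strongly convex with sigma = 1, d(x0) = 0.
# The particular x0, beta, and s below are arbitrary test values.
x0 = np.array([1.0, -2.0, 0.5])
beta = 0.7

def objective(s, x):
    """Inner objective <s, x - x0> - beta * d(x)."""
    return np.dot(s, x - x0) - beta * 0.5 * np.dot(x - x0, x - x0)

def V(s):
    """Closed-form value of the max for this particular d: ||s||^2 / (2*beta)."""
    return np.dot(s, s) / (2 * beta)

s = np.array([0.3, 1.1, -0.4])

# V_beta(s) is nonnegative ...
assert V(s) >= 0.0
# ... the trial point x = x0 gives exactly the lower bound 0 (copper.hat's step) ...
assert objective(s, x0) == 0.0
# ... and V(s) dominates the objective at every trial point, with equality
# at the maximizer x0 + s / beta.
for x in [x0, x0 + s / beta, np.zeros(3), np.array([5.0, 5.0, 5.0])]:
    assert objective(s, x) <= V(s) + 1e-12
```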
Previous answer:
I'll assume that $\beta = 1$ for simplicity. Then $V_\beta(s) = -\langle s, x_0 \rangle + d^*(s)$, where $d^*$ is the Fenchel conjugate of $d$. By the Fenchel inequality, $\langle s, x_0 \rangle \leq d(x_0) + d^*(s)$. Since $d(x_0) = 0$, this gives $0 \leq - \langle s, x_0 \rangle + d^*(s) = V_\beta(s)$.
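As for the parameter $\sigma\beta$ in the second question: if it refers to $\beta d$, the claim follows directly from the definition of strong convexity, since multiplying by $\beta > 0$ scales the convexity parameter. A short derivation: strong convexity of $d$ with parameter $\sigma$ means, for all $x, y \in Q$ and $\alpha \in [0,1]$,

```latex
\begin{align*}
d(\alpha x + (1-\alpha) y)
  &\leq \alpha\, d(x) + (1-\alpha)\, d(y)
     - \tfrac{\sigma}{2}\, \alpha (1-\alpha) \|x - y\|^2,
\intertext{and multiplying both sides by $\beta > 0$ gives}
\beta\, d(\alpha x + (1-\alpha) y)
  &\leq \alpha\, \beta d(x) + (1-\alpha)\, \beta d(y)
     - \tfrac{\sigma \beta}{2}\, \alpha (1-\alpha) \|x - y\|^2,
\end{align*}
```

which is exactly strong convexity of $\beta d$ with parameter $\sigma\beta$.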