Let $f:X \to \mathbb{R}$ be a function such that $\partial f(x)\neq \emptyset$ for all $x \in X$. Show that $f$ is convex.
I would appreciate some help with getting started on this problem. Thanks
Let $(x_1, x_2, \lambda) \in X^2 \times [0, 1]$, and set $x = \lambda x_1 + (1 - \lambda) x_2$. We wish to show that $\lambda f(x_1) + (1 - \lambda) f(x_2) \ge f(x)$.
The hypothesis implies that $\partial f(x) := \{v \in X \text{ s.t. } f(z) \ge f(x) + \langle v, z - x\rangle\ \forall z \in X\} \ne \emptyset$, so we may choose a subgradient $v \in \partial f(x)$. Since $x_1 - x = (1 - \lambda)(x_1 - x_2)$ and $x_2 - x = -\lambda(x_1 - x_2)$, one has \begin{equation} f(x_1) \ge f(x) + \langle v, x_1 - x\rangle = f(x) + (1 - \lambda)\langle v, x_1 - x_2\rangle, \end{equation} and \begin{equation} f(x_2) \ge f(x) + \langle v, x_2 - x\rangle = f(x) - \lambda\langle v, x_1 - x_2\rangle. \end{equation} Multiplying the first inequality by $\lambda$, the second by $(1 - \lambda)$, and adding the results yields $\lambda f(x_1) + (1 - \lambda) f(x_2) \ge f(x) + 0 = f(x)$, which is the required result.
Hint: Consider $x_1, x_2\in X$ and $x=\lambda x_1+(1-\lambda)x_2$ for $\lambda\in(0,1)$. What can you say about $f(x_1)-f(x)$? Or about $f(x_2)-f(x)$? What does this imply about $\lambda f(x_1)+(1-\lambda)f(x_2)-f(x)$?