Let $F: \mathbb{R}^n \to (-\infty,\infty]$ be a convex function and let $C \subset \mathbb{R}^n$ be convex. If $x \in C$ is a minimum point for $F$ restricted to $C$ and $\partial F(x) \neq \emptyset$, can we find $v \in \partial F(x)$ such that $$ (v,y - x) \geq 0 $$ for all $y \in C$?
This is certainly the case if $F$ is differentiable, but I need such a result in the non-differentiable case.
Yes. $x \in C$ is optimal if and only if there exists $g \in \partial F(x)$ such that $-g \in \mathcal{N}_C(x)$, where $$\mathcal{N}_C(x) = \{ d \in \mathbb{R}^n : d^T (y - x) \leq 0 \text{ for all } y \in C \}$$ is the normal cone to $C$ at $x$. Unpacking the condition $-g \in \mathcal{N}_C(x)$: $$ (-g)^T (y - x) \leq 0 \iff g^T (y - x) \geq 0 \quad \text{for all } y \in C, $$ which is exactly the inequality you want, with $v = g$.
The proof is a simple application of Fermat's optimality condition together with the subdifferential sum rule. Cast the problem as $\min_x F(x) + I_C(x)$, where $I_C$ is the indicator function of $C$ and $\partial I_C(x) = \mathcal{N}_C(x)$. Fermat's condition gives $0 \in \partial (F + I_C)(x)$, and the sum rule (which requires a regularity condition, e.g. $\operatorname{ri}(\operatorname{dom} F) \cap \operatorname{ri}(C) \neq \emptyset$) then yields $$ 0 \in \partial F(x) + \mathcal{N}_C(x), $$ and hence $(-\partial F(x)) \cap \mathcal{N}_C(x) \neq \emptyset$, which is the statement above. See Beck, *First-Order Methods in Optimization* (2017), Theorem 3.67.
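As a concrete sanity check (my own example, not from Beck), take $F(x) = |x|$ and $C = [0, 2]$. The minimizer is $x^* = 0$, where $F$ is not differentiable and $\partial F(0) = [-1, 1]$. The claim is that *some* $v \in \partial F(0)$ satisfies $v^T(y - x^*) \geq 0$ on $C$, not every $v$:

```python
import numpy as np

# Minimize F(x) = |x| over C = [0, 2]. The minimizer is x* = 0, where
# the subdifferential is the whole interval [-1, 1].
x_star = 0.0
ys = np.linspace(0.0, 2.0, 201)  # sample points of C

# Some subgradients satisfy the variational inequality v * (y - x*) >= 0 ...
for v in (0.0, 0.5, 1.0):
    assert np.all(v * (ys - x_star) >= 0)

# ... but not all of them: v = -1 lies in the subdifferential of |.| at 0,
# yet fails the inequality at every y > 0. The theorem asserts existence
# of *some* suitable v, not that every subgradient works.
assert np.any(-1.0 * (ys - x_star) < 0)
print("some v in [-1, 1] satisfies the inequality, but not every v")
```

This also shows why the existential quantifier in the statement matters: picking an arbitrary element of $\partial F(x)$ is not enough.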