Necessary/sufficient condition for directional derivative to be zero (infinite-dimensional optimization)


Let $E$ be a function space, and $f : E \rightarrow \mathbb{R}$ be a directionally-differentiable functional.

For a closed convex subset $S \subseteq E$, define $x^*$ to be a minimizer of $f$, $$ x^* \in \operatorname{argmin}_{ x \in S} f(x) $$

Assuming such an $x^*$ exists, the directional derivative of $f$ at $x^*$ toward any other $y \in S$ is non-negative, $$ D_{y - x^*} f(x^*) \ge 0 $$ (otherwise, $f$ could be decreased by moving slightly from $x^*$ toward $y$, so $x^*$ could not be a minimizer).
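For completeness, here is the standard one-line derivation, using only the convexity of $S$ and the definition of the directional derivative. Since $S$ is convex, $x^* + t(y - x^*) \in S$ for every $t \in (0, 1]$, so optimality of $x^*$ gives $f(x^* + t(y - x^*)) \ge f(x^*)$, and therefore
$$ D_{y - x^*} f(x^*) = \lim_{t \to 0^+} \frac{f\big(x^* + t(y - x^*)\big) - f(x^*)}{t} \ge 0. $$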

I am interested in the necessary/sufficient conditions on $x^*$ that make the above inequality an equality, i.e., $D_{y - x^*} f(x^*) = 0$. For finite-dimensional spaces (e.g., $E = \mathbb{R}^n$), it is sufficient that $x^*$ lies in the relative interior of $S$. However, this question appears to become more complicated in infinite dimensions.
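A sketch of why relative interiority suffices in the finite-dimensional case, under the additional assumption that the directional derivative is linear in the direction (e.g., $f$ is Gâteaux differentiable at $x^*$): if $x^* \in \operatorname{relint} S$, then for each $y \in S$ there exists $t > 0$ small enough that $z := x^* - t(y - x^*) \in S$, since $y - x^*$ lies in the affine hull of $S$ and $x^*$ has a relative neighborhood inside $S$. Applying the optimality inequality with $z$ in place of $y$ gives
$$ 0 \le D_{z - x^*} f(x^*) = D_{-t(y - x^*)} f(x^*) = -t\, D_{y - x^*} f(x^*), $$
so $D_{y - x^*} f(x^*) \le 0$, which combined with $D_{y - x^*} f(x^*) \ge 0$ yields equality. In infinite dimensions the relative interior of a closed convex set is frequently empty (e.g., the positive cone of $L^p$ spaces), which is one reason the finite-dimensional sufficient condition does not transfer directly.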

What is the necessary and/or sufficient condition on $x^*$ that guarantees that $$ D_{y - x^*} f(x^*) = 0 $$ for every $y\in S$?