A well-known fact in optimization theory is the following:
Let $C_{ad}$ be a non-empty, convex subset of a real Banach space $B$ and let $F: U \to \mathbb{R}$ be a function defined on an open set $U$ containing $C_{ad}$. If $F$ is Gâteaux differentiable (or at least has all directional derivatives) in $U$ and $\overline{u} \in C_{ad}$ is such that $$ \overline{u} = \operatorname{argmin}_{u \in C_{ad}} F(u), $$ then $$ F'(\overline{u})(u - \overline{u}) \ge 0 \qquad \forall u \in C_{ad}. $$
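For reference, the standard argument behind this is short: since $C_{ad}$ is convex, $\overline{u} + t(u - \overline{u}) \in C_{ad}$ for every $u \in C_{ad}$ and every $t \in [0,1]$, so minimality of $\overline{u}$ makes the difference quotient non-negative and hence $$ F'(\overline{u})(u - \overline{u}) = \lim_{t \to 0^+} \frac{F(\overline{u} + t(u - \overline{u})) - F(\overline{u})}{t} \ge 0. $$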
I have some trouble really understanding this so-called variational inequality; in particular, I cannot find an easy but significant example in which the inequality is strict (at least for some $u$).
For instance, if $B=\mathbb{R}^n$, all (existing) directional derivatives at an unconstrained minimizer must obviously vanish. Is the strictness somehow related to finite versus infinite dimension?
I would be grateful if someone could provide such an example and perhaps some insight into this famous variational inequality.
I suggest the following example. Take the first quadrant as $C_{ad}$, i.e., the set of points with $x \geq 0$ and $y \geq 0$, and take as objective function $F(x,y) = (x+1)^2 + (y-1)^2$. The minimum over $C_{ad}$ is attained at $\overline{u} = (0,1)$, where the gradient is $\nabla F(0,1) = (2,0)$. Any admissible direction at the minimizer can be written as $u - \overline{u} = (x, y-1)$ with $x, y \geq 0$. The scalar product between these two vectors is $2x$, which is strictly positive for $x > 0$.
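For completeness, here is a quick numerical sanity check of this example (a minimal sketch using `numpy` and `scipy.optimize.minimize` with box bounds; the starting point and the test points $u$ are arbitrary choices of mine, not part of the example itself):

```python
import numpy as np
from scipy.optimize import minimize

# Objective from the example: F(x, y) = (x + 1)^2 + (y - 1)^2
def F(p):
    x, y = p
    return (x + 1) ** 2 + (y - 1) ** 2

# Minimize over the first quadrant C_ad = {x >= 0, y >= 0}
res = minimize(F, x0=[1.0, 0.0], bounds=[(0, None), (0, None)])
u_bar = res.x  # expected minimizer: (0, 1)

# Gradient of F at the minimizer: (2(x + 1), 2(y - 1)) = (2, 0)
grad = np.array([2 * (u_bar[0] + 1), 2 * (u_bar[1] - 1)])

# F'(u_bar)(u - u_bar) = grad . (u - u_bar) should be >= 0 for every
# admissible u, and strictly positive whenever u has x > 0.
for u in [np.array([0.0, 2.0]), np.array([1.0, 1.0]), np.array([3.0, 0.0])]:
    print(u, grad @ (u - u_bar))  # ~0 along the boundary x = 0, > 0 otherwise
```

It prints a value close to $0$ for $u = (0,2)$ (a direction along the boundary $x = 0$) and strictly positive values for the points with $x > 0$, matching the computation above.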
So this is not related to finite versus infinite dimension; what matters is that the minimizer lies on the boundary of $C_{ad}$. At an interior minimizer the full derivative must vanish, so the inequality can only be strict at a constrained (boundary) minimizer.