Let $\Omega\subset \mathbb{R}^n$ be a closed convex set and let $f:\mathbb{R}^n\rightarrow \mathbb{R}$ be a function.
I read that the condition $$ u \in \arg \max_{v\in \Omega} \; f(v) $$ can be weakened to $$ \langle \nabla f(u), y-u \rangle \le 0 \quad \text{for all } y\in \Omega. $$
If $u\in \overset{\circ}{\Omega}$ (the interior of $\Omega$), are the two conditions equivalent?
Also, does the result follow from the supporting hyperplane theorem and nothing else?
Presumably $f$ is differentiable.
Suppose $\langle \nabla f(u), y-u\rangle > 0$ for some $y \in \Omega$. This inner product is the directional derivative $\lim_{t \downarrow 0} \frac{f(u+t(y-u))-f(u)}{t}$, so for sufficiently small $t > 0$ we have $f(u+t(y-u)) > f(u)$, and thus $u$ is not a maximizer of $f$ over $\Omega$. Phrased differently, if $u$ is a maximizer of $f$ over $\Omega$, then $\langle \nabla f(u), y-u\rangle \le 0$ for all $y \in \Omega$.
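A quick numerical sanity check of this direction (my own toy example, not from the question): take $f(v) = v_1 v_2$, $u = (1,1)$, $y = (2,2)$. Then $\langle \nabla f(u), y-u\rangle = 2 > 0$, and indeed a small step from $u$ toward $y$ increases $f$:

```python
import numpy as np

# Toy example: f(v) = v1 * v2, so grad f(v) = (v2, v1).
f = lambda v: v[0] * v[1]
grad_f = lambda v: np.array([v[1], v[0]])

u = np.array([1.0, 1.0])
y = np.array([2.0, 2.0])

# Positive directional derivative at u in the direction y - u ...
assert grad_f(u) @ (y - u) > 0

# ... so a small step along the segment toward y strictly increases f,
# showing u cannot be a maximizer.
t = 1e-3
assert f(u + t * (y - u)) > f(u)
```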
If we further assume $f$ is concave and twice differentiable, then the converse holds as well. Suppose $\langle \nabla f(u), y-u\rangle \le 0$ for all $y \in \Omega$. Then for any $y \in \Omega$, $$f(y) = f(u) + \underbrace{\langle \nabla f(u), y-u\rangle}_{\le 0} + \frac{1}{2} \underbrace{\langle\nabla^2f(\xi) (y-u), y-u\rangle}_{\le 0} \le f(u),$$ where $\xi = u+t(y-u)$ for some $t \in (0,1)$. The gradient term is nonpositive by our assumption, and the Hessian term is nonpositive because concavity of $f$ implies the Hessian is negative semi-definite. In summary, if $f$ is concave and $\langle \nabla f(u), y-u\rangle \le 0$ for all $y \in \Omega$, then $u$ is a maximizer of $f$ over $\Omega$.
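To see the variational inequality in action for a concave objective (an illustrative example of my own choosing): maximize $f(v) = -\|v-c\|^2$ over the box $\Omega = [0,1]^2$ with $c = (2, 0.5)$ outside the box. The maximizer is the Euclidean projection of $c$ onto $\Omega$, and $\langle \nabla f(u), y-u\rangle \le 0$ holds for every $y \in \Omega$:

```python
import numpy as np

# Concave objective f(v) = -||v - c||^2 with gradient -2(v - c),
# maximized over the box Omega = [0,1]^2.
c = np.array([2.0, 0.5])
grad_f = lambda v: -2.0 * (v - c)

# The maximizer is the projection of c onto the box: u = (1, 0.5).
u = np.clip(c, 0.0, 1.0)

# Check <grad f(u), y - u> <= 0 on many sampled points y in Omega.
rng = np.random.default_rng(0)
ys = rng.uniform(0.0, 1.0, size=(1000, 2))
assert np.all((ys - u) @ grad_f(u) <= 1e-12)
```

Here $\nabla f(u) = (2, 0)$ points out of the box through the face $v_1 = 1$, so $\langle \nabla f(u), y-u\rangle = 2(y_1 - 1) \le 0$ for all feasible $y$, exactly as the condition requires.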
Response to comment:
$\langle \nabla f(u), y-u\rangle$ is the directional derivative in the direction from $u$ to $y$. Taking $t$ from $1$ to $0$, the point $u+t(y-u)$ varies from $y$ to $u$, and by convexity of $\Omega$ these points all lie in $\Omega$. When I write $\lim_{t \downarrow 0} \frac{f(u+t(y-u))-f(u)}{t}$, I am taking $t \to 0$ from above, so $t$ is always positive.
If you take $t < 0$, there is no guarantee that $u+t(y-u)$ lies in $\Omega$ anymore. But if it does, you can just consider a different "$y$" to handle this "negative" direction.
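A one-dimensional illustration of these last two paragraphs (my own toy example): with $\Omega = [0,1]$, $u = 0.25$, and $y = 1$, the points $u + t(y-u)$ stay in $\Omega$ for $t \in (0,1]$ but can leave it for $t < 0$, and the "negative" direction is recovered by choosing a different endpoint $y' = 0$:

```python
# Omega = [0, 1] as a subset of R.
in_omega = lambda x: 0.0 <= x <= 1.0

u, y = 0.25, 1.0

# For t in (0, 1], u + t*(y - u) traces the segment from u to y inside Omega.
assert all(in_omega(u + t * (y - u)) for t in (0.99, 0.5, 0.01))

# For t < 0 the point can leave Omega: t = -0.5 gives -0.125.
assert not in_omega(u + (-0.5) * (y - u))

# The "negative" direction from u is still covered by another choice of y:
y_alt = 0.0
assert all(in_omega(u + t * (y_alt - u)) for t in (0.99, 0.5, 0.01))
```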