Let $S$ be a nonempty, bounded convex set in $\mathbb{R}^n$, and let $f: \mathbb{R}^n \to \mathbb{R}$ be defined as follows: $$ f(y) = \sup \{y^T x: x\in S\}. $$ The function $f$ is called the support function of $S$. Prove that $f$ is convex. Also, show that if $f(y) = y^T \bar{x}$, where $\bar{x} \in S$, then $\bar{x}$ is a subgradient of $f$ at $y$.
This problem comes from “Nonlinear Programming: Theory and Algorithms”.
You need neither boundedness nor convexity of $S$ for these claims to hold (though without boundedness, $f$ may take the value $+\infty$). For any $x, y \in \mathbb{R}^n$ and $t \in [0,1]$,
$$ \begin{split} f(tx+(1-t)y) &:= \sup_{z \in S}\,(tx+(1-t)y)^Tz\\ &= \sup_{z \in S}\,\bigl(tx^Tz+(1-t)y^Tz\bigr)\\ &\le \sup_{z \in S}tx^Tz + \sup_{z \in S}(1-t)y^Tz\\ &= t\sup_{z \in S}x^Tz + (1-t)\sup_{z \in S}y^Tz\\ &=: tf(x) + (1-t)f(y), \end{split} $$ where the last inequality becomes an equality because $t \ge 0$ and $1-t \ge 0$.
Thus $f$ is convex (and this does not depend on whether $S$ is convex or bounded).
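As a quick numerical sanity check (not part of the proof), one can test the convexity inequality at random points. The finite set `S` below is an arbitrary illustrative choice; for a finite set the sup in the support function is just a max over its elements.

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary finite set S in R^2 (illustrative choice); for finite S
# the sup in the support function is a plain max over its elements.
S = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0], [0.5, 0.5]])

def f(y):
    """Support function f(y) = sup_{x in S} y^T x."""
    return float(np.max(S @ y))

# Check f(t x + (1-t) y) <= t f(x) + (1-t) f(y) at random points.
for _ in range(1000):
    x, y = rng.normal(size=2), rng.normal(size=2)
    t = rng.uniform()
    assert f(t * x + (1 - t) * y) <= t * f(x) + (1 - t) * f(y) + 1e-12
```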
Now suppose $f(x) = z^Tx$ for some $z \in S$. Then for any $x' \in \mathbb R^n$, one computes
$$ f(x) + z^T(x'-x) = f(x) - z^Tx + z^Tx' = 0 + z^Tx' = z^Tx' \le \sup_{s \in S}s^Tx' = f(x'). $$
Thus $z$ is a subgradient of $f$ at $x$, and once again, this does not depend on whether $S$ is convex or bounded.
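The subgradient inequality can likewise be checked numerically. Again the finite set `S` and the point `x` are arbitrary illustrative choices, and `maximizer` picks an element $z \in S$ attaining the sup (which exists here because `S` is finite):

```python
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary finite set S (illustrative); the sup is attained at an element.
S = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])

def f(y):
    """Support function f(y) = sup_{x in S} y^T x."""
    return float(np.max(S @ y))

def maximizer(y):
    """An element z of S attaining f(y) (exists since S is finite)."""
    return S[np.argmax(S @ y)]

x = np.array([2.0, -1.0])
z = maximizer(x)          # by construction, f(x) = z^T x

# Subgradient inequality: f(x') >= f(x) + z^T (x' - x) for all x'.
for _ in range(1000):
    xp = rng.normal(size=2)
    assert f(xp) >= f(x) + z @ (xp - x) - 1e-12
```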