Consider the everywhere twice differentiable function $f:\mathbb R^n\to \mathbb R$, the closed and convex set $\mathcal S$, and the convex optimization problem
$$ \min_{x\in \mathcal S} \; f(x). $$
Is there an easy / intuitive way of proving both statements?
$x = x^*$ is a local minimizer if $\nabla f(x^*) = 0$ and $\nabla^2 f(x^*) \succ 0$. Note that the weaker condition $\nabla^2 f(x^*) \succeq 0$ is not sufficient: for the counterexample $f(x) = x^3$ we have $\nabla f(0) = 0$ and $\nabla^2 f(0) = 0$, yet $x = 0$ is not a local minimizer (it is an inflection point).
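As a quick numerical sanity check of that counterexample (my own sketch, not part of either statement), one can verify that $f(x) = x^3$ satisfies both first- and second-order conditions with equality at $x = 0$ and still takes smaller values arbitrarily close by:

```python
# Counterexample f(x) = x^3: both derivatives vanish at x = 0,
# yet 0 is not a local minimizer.

def f(x):
    return x ** 3

def fprime(x):       # f'(x) = 3x^2
    return 3 * x ** 2

def fsecond(x):      # f''(x) = 6x
    return 6 * x

# First- and second-order conditions hold (with equality) at x = 0 ...
assert fprime(0.0) == 0.0
assert fsecond(0.0) == 0.0

# ... but points just to the left of 0 have strictly smaller values.
eps = 1e-6
assert f(-eps) < f(0.0)
print("f'(0) = f''(0) = 0, but f(-eps) < f(0): not a local minimum")
```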
$x = x^*$ is a global minimizer if $\nabla f(x^*) = 0$ and $\nabla^2 f(x) \succeq 0$ for all $x\in \mathcal S$.
The second statement in particular is well known in the convex optimization literature. However, I wonder if there is a nice proof, to reassure ourselves that there are no corner cases (like the one found in case 1).
Yes to both questions (assuming, for the first one, that $\nabla^2 f$ is continuous).
The result follows from the multivariable Taylor theorem with Lagrange remainder: $$ f(x+v) = f(x) + \nabla f(x)\cdot v + \tfrac12\,\big(\nabla^2 f(x+\theta v) : v\otimes v\big) $$ for some $\theta\in(0,1)$, where $A : v\otimes v = v^\top A v$. Setting $x = x^*$ and using $\nabla f(x^*) = 0$, this reduces to $$ f(x^*+v) - f(x^*) = \tfrac12\,\big(\nabla^2 f(x^*+\theta v) : v\otimes v\big). $$ If $\nabla^2 f \succeq 0$ on $\mathcal S$, the right-hand side is nonnegative for every $v$ with $x^* + v \in \mathcal S$ (note that $x^* + \theta v \in \mathcal S$ by convexity of $\mathcal S$), which proves (2.).
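To see statement (2.) in action, here is a small numerical sketch (my own illustration, with a convex function I chose for the example): for $f(x, y) = (x-1)^2 + 2(y+3)^2$ the Hessian is $\operatorname{diag}(2, 4) \succeq 0$ everywhere and the gradient vanishes at $(1, -3)$, so the Taylor identity forces $f(x^* + v) \ge f(x^*)$ for every displacement $v$:

```python
# Convex example: f(x, y) = (x - 1)^2 + 2*(y + 3)^2.
# Gradient vanishes at x* = (1, -3); Hessian diag(2, 4) is PSD everywhere,
# so x* should be a *global* minimizer by statement (2.).

def f(x, y):
    return (x - 1) ** 2 + 2 * (y + 3) ** 2

x_star = (1.0, -3.0)
f_star = f(*x_star)

# Sample a coarse grid of displacements v and confirm f never dips below f(x*).
grid = [i / 2.0 for i in range(-10, 11)]
assert all(f(x_star[0] + vx, x_star[1] + vy) >= f_star
           for vx in grid for vy in grid)
print("f(x* + v) >= f(x*) on the whole sampled grid")
```

Of course a grid check is not a proof; the Taylor identity above is what guarantees the inequality for all $v$.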
For (1.), the assumption $\nabla^2 f(x^*) \succ 0$ together with the continuity of $\nabla^2 f$ implies that $\nabla^2 f(x^* + \theta v) \succ 0$ for all sufficiently small $\|v\|$ (see this question), so the Taylor formula gives $f(x^*+v) - f(x^*) > 0$ for all sufficiently small $v \neq 0$, i.e. $x^*$ is a strict local minimizer.
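A small sketch (my own, with a function chosen to make the point) of why condition (1.) is only local: $f(x) = x^2 + x^3$ has a stationary point at $x = 0$ with $f''(0) = 2 > 0$, so $0$ is a strict local minimizer, yet $f$ is unbounded below, consistent with the fact that $x^2 + x^3$ is not convex on all of $\mathbb R$:

```python
# f(x) = x^2 + x^3: positive definite Hessian at the critical point x = 0,
# hence a strict *local* minimizer -- but not a global one.

def f(x):
    return x ** 2 + x ** 3

def fsecond(x):      # f''(x) = 2 + 6x
    return 2 + 6 * x

assert fsecond(0.0) > 0          # Hessian positive definite at x = 0

# Strict local minimum: f > f(0) in a small punctured neighborhood of 0 ...
eps = 1e-3
assert f(eps) > f(0.0) and f(-eps) > f(0.0)

# ... but not a global minimum: f(-2) = 4 - 8 = -4 < 0 = f(0).
assert f(-2.0) < f(0.0)
print("x = 0 is a strict local but not a global minimizer of x^2 + x^3")
```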