I'm trying to prove that monotonicity of the gradient is equivalent to strong convexity, i.e., for a twice-differentiable $F:\mathbb R^d\to \mathbb R$ and all $x\in \mathbb R^d$,
$$ D^2F(x) \ge c_0\text{Id} \Longleftrightarrow \langle DF(x)-DF(y),x-y\rangle \ge c_0|x-y|^2 \qquad \forall y\in \mathbb R^d$$ where $\langle\cdot,\cdot\rangle$ is the Euclidean dot product and $|\cdot|$ is the Euclidean norm.
I've shown the $\Rightarrow$ direction using Taylor expansion, but I'm stuck on the opposite direction.
For dimension 1, I know I end up with
$$ \frac{F'(y)-F'(x)}{y-x}\ge c_0 \qquad x\neq y$$ and can take the limit as $y\to x$ (assuming $F''(x)$ exists), but I'm not quite sure what to do in higher dimensions. I was told that the trick is to Taylor expand the dot product itself, so I tried the following:
Start with $$ \langle DF(x)-DF(y),x-y\rangle = \langle D^2F(\xi)(x-y),x-y\rangle \tag{Taylor expand about $y$}$$ which results in $$(x-y)^TD^2F(\xi)(x-y) \ge c_0|x-y|^2,$$ which certainly looks like the desired lower bound on the Hessian, but of course, I need to be evaluating at $x$ instead of at $\xi$. Also, I somehow need the lingering $x-y$ to go away too, which doesn't quite make sense to me.
Take a unit vector $v\in \mathbb R^d$, set $y:=x+tv$ for $t>0$, divide by $t^2$, and send $t\searrow0$. Since $\xi$ lies on the segment between $x$ and $y=x+tv$, it converges to $x$ as $t\searrow 0$, so continuity of $D^2F$ gives the desired inequality.
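Writing the limiting argument out in full (assuming $D^2F$ is continuous, so that $D^2F(\xi_t)\to D^2F(x)$ as $\xi_t\to x$):

```latex
% Apply the hypothesis with y = x + tv, where |v| = 1 and t > 0,
% then the mean value form of Taylor's theorem with some \xi_t on [x, x+tv]:
\begin{align*}
c_0 t^2 |v|^2
  &\le \langle DF(x+tv)-DF(x),\, tv\rangle \\   % monotonicity hypothesis
  &=   t^2\,\langle D^2F(\xi_t)\,v,\, v\rangle. % Taylor / mean value theorem
\end{align*}
% Divide by t^2 > 0 and let t \searrow 0; then \xi_t \to x, and continuity
% of D^2F gives
\[
\langle D^2F(x)\,v,\, v\rangle \ge c_0|v|^2
\qquad \text{for every unit vector } v\in\mathbb R^d,
\]
% which is exactly D^2F(x) \ge c_0 \,\mathrm{Id}.
```

Note that the $t^2$ on both sides is why the "lingering" factor $x-y=-tv$ disappears: both the quadratic form and the right-hand side scale the same way in $t$.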