Assume $f: [a, b] \rightarrow \mathbb{R}$ is continuous on the closed interval $[a,b]$, where $a, b \in \mathbb{R}$, and that $f$ is differentiable on $(a, b)$.
Then for any $c$ and $d > 0$ with $[c, c+d] \subseteq [a, b]$, the Mean Value Theorem states that $$f(c + d) = f(c) + d \times f'(e)$$ for some $e \in (c, c+d)$. Now assume instead that $f: [a, b] \rightarrow \mathbb{R}$ is continuous on $[a,b]$ and has a left and a right derivative at every interior point, where the two need not be equal and one of them may be $\pm \infty$.
Then the following inequality holds: $$ f(c + d) \geq f(c) + d \times \min\{f'(e-), f'(e+)\} $$
where again $e \in (c, c+d)$. I understand that, by the generalization of the MVT to functions with only one-sided derivatives (see for instance this post for more details), we get $\min\{f'(e-), f'(e+)\}$ in place of $f'(e)$ above, but where does the inequality come from?
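For concreteness, here is a minimal example of my own (not taken from the linked post) in which no point satisfies the MVT equality, yet the stated inequality still holds:

```latex
\text{Take } f(x) = |x| \text{ on } [-1, 1],\quad c = -1,\quad d = 2.
\\[4pt]
\frac{f(c+d) - f(c)}{d} = \frac{|1| - |{-1}|}{2} = 0,
\qquad\text{but } f'(x) \in \{-1, +1\} \text{ for all } x \neq 0,
\\[4pt]
\text{so no } e \in (-1, 1) \text{ satisfies } f(c+d) = f(c) + d \times f'(e).
\\[4pt]
\text{At the corner } e = 0:\quad f'(0-) = -1,\ f'(0+) = 1,
\\[4pt]
f(c) + d \times \min\{f'(0-), f'(0+)\} = 1 + 2 \times (-1) = -1 \leq 1 = f(c+d),
```

so the inequality holds (here strictly), even though the equality version of the theorem fails at every point.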