I have seen the following pattern in a few online lectures:
We have an inequality, e.g. $e^x \gt 1 + x$ for $x > 0$, and we try to prove it with the mean value theorem applied on the interval $[0,x]$.
Why is this allowed, if the inequality itself only holds on $(0,x]$ and not at the endpoint $0$?
Pay close attention to the statement of the mean value theorem: when it is applied to a function $f$ that is continuous on the closed interval $[a,b]$ and differentiable on the open interval $(a,b)$, it gives you a point $\xi$ in the open interval $(a,b)$ such that $f'(\xi)=\frac{f(b)-f(a)}{b-a}$.
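For a concrete sketch of how this plays out for the example above (assuming the lectures take $f(t) = e^t$): $f$ is continuous on $[0,x]$ and differentiable on $(0,x)$ for any $x > 0$, so the mean value theorem gives some $\xi \in (0,x)$ with
$$\frac{e^x - e^0}{x - 0} = f'(\xi) = e^\xi.$$
Since $\xi > 0$ strictly, we get $e^\xi > 1$, and multiplying by $x > 0$ yields $e^x - 1 > x$, i.e. $e^x > 1 + x$. The hypotheses are checked on the closed interval $[0,x]$, but the strict inequality $e^\xi > 1$ only uses the fact that $\xi$ lies in the open interval $(0,x)$; the inequality being proved is never assumed at the endpoint $0$.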