If $f(x)\geq 0$ for $a \lt x \lt x_0$ and $\displaystyle\lim_{x \to {x_0}^-} f(x)$ exists, then $\displaystyle\lim_{x \to {x_0}^-} f(x)\geq 0$.
My issue: if I assume $\displaystyle\lim_{x \to {x_0}^-} f(x)=L$, I only know that the values $f(x)$ lie in $(L - \epsilon, L + \epsilon)$; for all I know they could all sit in $(L, L + \epsilon)$. In particular, I cannot guarantee that at least one $f(x)$ is in the interval $(L - \epsilon, L)$.
With these kinds of questions I usually try a proof by contradiction.
Let $L=\lim_{x\to x_0^-}f(x)$ and assume that $L<0$. Then we can write $L=-\alpha$ with $\alpha>0$. Take $\epsilon=\alpha$. By the $\epsilon$-$\delta$ definition of the one-sided limit, there exists some $\delta>0$ such that $x\in(x_0-\delta,x_0)$ implies $|f(x)-L|<\alpha$; shrinking $\delta$ if necessary, we may also assume $x_0-\delta>a$, so that these $x$ lie in $(a,x_0)$. By the definition of the absolute value, the inequality $|f(x)-L|<\alpha$ means $$ L-\alpha<f(x)<L+\alpha. $$ The right-hand inequality gives $$ f(x)<L+\alpha=-\alpha+\alpha=0, $$ which contradicts the fact that $f(x)\geq0$ for all $x\in(a,x_0)$.
We conclude that our assumption that $L<0$ must have been wrong, and therefore that $\lim_{x\to x_0^-}f(x)\geq0$.
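As a side remark, the conclusion really is $\geq 0$ and not $>0$: a strictly positive function can still have left-hand limit exactly $0$. Here is a quick numeric sanity check (not part of the proof; the concrete choice $f(x)=x_0-x$ with $x_0=1$, $a=0$ is mine, purely for illustration):

```python
# Illustration (my own example): f(x) = x0 - x with x0 = 1, a = 0.
# f is strictly positive on (0, 1), yet f(x) -> 0 as x -> 1 from the left,
# so the left-hand limit is >= 0 but not > 0.

x0 = 1.0

def f(x):
    return x0 - x

# Sample points approaching x0 from the left: x0 - 10^(-k).
xs = [x0 - 10.0**(-k) for k in range(1, 8)]
values = [f(x) for x in xs]

assert all(v > 0 for v in values)  # every sampled value is strictly positive
print(values)                      # the values shrink toward 0
```

So any attempted strengthening of the statement to $\lim_{x\to x_0^-}f(x)>0$ fails on examples like this one.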