Suppose that a real-valued function $f$ is continuous at $x_0$. Then the epsilon-delta definition says that for every $\epsilon > 0$ there is a $\delta > 0$ such that $|f(x)-f(x_0)| < \epsilon$ whenever $|x-x_0| < \delta$.
Based on this definition, how do I show that $f(x_0+h)-f(x_0) = O(h)$? That is, that $|f(x_0+h)-f(x_0)| \leq C|h|$ for some constant $C$ and all $h$ with $|h| \leq h_0$, for some $h_0 > 0$?
$f$ need not be differentiable, so I cannot really use Taylor's theorem here.
The statement is not true. Take, for example, the function $\sqrt{x}$ on $[0,\infty)$. It is continuous at $0$, but $$|\sqrt{0 + h} -\sqrt{0}\,| = \sqrt{h},$$ which cannot be bounded by $C\,h$ for $0 < h\leq h_0$, no matter which $C$ and $h_0$ you choose: the ratio $\sqrt{h}/h = 1/\sqrt{h}$ blows up as $h \to 0^+$.
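To spell the counterexample out, here is a short worked contradiction (all symbols as in the answer above):

```latex
Suppose, for contradiction, that there exist $C > 0$ and $h_0 > 0$ with
\[
  \sqrt{h} \;=\; \bigl|\sqrt{0+h} - \sqrt{0}\,\bigr| \;\leq\; C\,h
  \qquad \text{for all } 0 < h \leq h_0 .
\]
Dividing both sides by $\sqrt{h} > 0$ gives
\[
  1 \;\leq\; C \sqrt{h} \qquad \text{for all } 0 < h \leq h_0 ,
\]
but the right-hand side tends to $0$ as $h \to 0^+$, a contradiction.
```

More generally, the bound $|f(x_0+h)-f(x_0)| \leq C|h|$ is exactly Lipschitz continuity at $x_0$, which is strictly stronger than continuity at $x_0$.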