Prove that if $f$ is a convex differentiable function with at least one root, then Newton's method converges for any $x_0$ with $f′(x_0)≠0$
I saw this theorem on Stack Exchange, and it seems like one of the most useful theorems I've seen related to Newton's method. However, I can't figure out a proof. I was wondering if anyone could help.
Hint: let $a$ denote a root to which we seek convergence and define $\epsilon_n:=x_n-a$, so (expanding $f$ to second order about $a$, which assumes $f''$ exists near $a$) $$\epsilon_{n+1}=\epsilon_n-\frac{f\left( a+\epsilon_n\right)}{f'\left( a+\epsilon_n\right)}\approx\epsilon_n\left( 1-\frac{f'\left( a\right)+\frac{1}{2}\epsilon_nf''\left( a\right)}{f'\left( a\right)+\epsilon_nf''\left( a\right)}\right)\approx\frac{f''\left( a\right)}{2f'\left( a\right)}\epsilon_n^2.$$ What conditions imply $\lim_{n\to\infty}\epsilon_n=0$?
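The error recursion in the hint is easy to check numerically. Below is a minimal sketch (not from the original post) using the hypothetical example $f(x)=x^2-2$, which is convex, differentiable, and has the root $a=\sqrt{2}$; for it $\frac{f''(a)}{2f'(a)}=\frac{1}{2\sqrt{2}}$, so the ratio $\epsilon_{n+1}/\epsilon_n^2$ should settle near that constant.

```python
import math

def newton(f, fprime, x0, steps):
    """Run Newton's method for a fixed number of steps; return all iterates."""
    xs = [x0]
    for _ in range(steps):
        x = xs[-1]
        xs.append(x - f(x) / fprime(x))
    return xs

# Hypothetical example: f(x) = x^2 - 2, convex with root a = sqrt(2).
f = lambda x: x * x - 2
fp = lambda x: 2 * x
a = math.sqrt(2)

xs = newton(f, fp, x0=3.0, steps=6)
errs = [x - a for x in xs]  # eps_n = x_n - a
for n, e in enumerate(errs):
    print(n, e)
# Per the hint, eps_{n+1} / eps_n^2 should approach f''(a)/(2 f'(a)) = 1/(2*sqrt(2)).
```

Running this shows the errors collapsing quadratically once the iterates get close to the root, matching the $\epsilon_{n+1}\approx\frac{f''(a)}{2f'(a)}\epsilon_n^2$ estimate.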