If $f:\mathbb{R} \to \mathbb{R}$ is differentiable with at least two roots, I wish to show that Newton's method will not converge for some $x_0$.
Since $f$ is differentiable and has two roots, Rolle's theorem gives a zero of $f'$, say at $z$, between them. It seems we should choose $x_0$ close to $z$ to ensure that the Newton iterates wander away. But it's hard to say anything more precise without knowing more about $f(x)$...
If $f'(x_0) = 0$, there is no $x_1$: the Newton step $x_1 = x_0 - f(x_0)/f'(x_0)$ is undefined.
In the case of a quadratic $f$ with two real roots, the vertex (the unique point with $f'(x_0) = 0$) is the only initial point where Newton's method goes wrong: for every other $x_0$, the iterates converge to one of the roots.
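A quick numerical check (a minimal sketch; the quadratic $f(x) = x^2 - 1$, the `newton` helper, and the starting points are my own illustrative choices):

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=100):
    """Iterate x <- x - f(x)/f'(x); return the final iterate, or None if a step is undefined."""
    x = x0
    for _ in range(max_iter):
        d = fprime(x)
        if d == 0:  # f'(x) = 0: the Newton step is undefined
            return None
        x_next = x - f(x) / d
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

f = lambda x: x**2 - 1   # roots at -1 and 1; f'(0) = 0 at the vertex
fp = lambda x: 2 * x

print(newton(f, fp, 0.0))    # None: no x_1 exists
print(newton(f, fp, 0.001))  # ~1.0 (the first step overshoots to ~500, then converges)
print(newton(f, fp, -3.0))   # ~-1.0
```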
EDIT: Somewhat more generally, if $f$ is a convex differentiable function with at least one root, you have convergence to a root from any $x_0$ with $f'(x_0) \ne 0$: the tangent line at any point lies below the graph of a convex function, so $f(x_1) \ge 0$, and from there the iterates move monotonically toward a root.
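To see the convex case numerically (again just a sketch, reusing the `newton` helper above; the convex function $g(x) = e^x - 2$, with root $\ln 2$, is my own example):

```python
import math

g = lambda x: math.exp(x) - 2   # convex; single root at ln 2
gp = lambda x: math.exp(x)      # g' > 0 everywhere, so every x_0 is admissible

for x0 in (-2.0, 0.0, 10.0):
    print(x0, "->", newton(g, gp, x0))  # each run converges to ln 2 ≈ 0.6931
```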