My lecture notes give the following derivation of the Newton-Raphson method for estimating solutions of the equation $f(x)=0$:
Let $f \in C^2 [a,b]$ and assume that $f$ has a root $x^*\in [a,b]$. Let $x_n$ be the $n$th approximation of $x^*$ such that $|x_n-x^*|$ is small.
Then $$0=f(x^*)=f(x_n)+f'(x_n)(x^*-x_n)+\frac{1}{2}f''(\xi(x_n))(x^*-x_n)^2,$$ for some $\xi(x_n)$ between $x^*$ and $x_n$.
Since $|x^*-x_n|$ is small, we have $$0\approx f(x_n)+f'(x_n)(x^*-x_n),$$ which rearranges to $x^* \approx x_n - \frac{f(x_n)}{f'(x_n)}$; taking the right-hand side as the next approximation $x_{n+1}$ gives Newton's method.
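For concreteness, the iteration this derivation produces can be sketched as follows (a minimal illustration; the stopping tolerance and the example function are my own choices, not part of the notes):

```python
# Newton's method: x_{n+1} = x_n - f(x_n) / f'(x_n).
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:  # stop once the update is negligible
            break
    return x

# Example: the positive root of f(x) = x^2 - 2 is sqrt(2).
print(newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0))
```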
Question
My question is simply one of why we need $f\in C^2[a,b]$. That is, why does $f''$ need to be continuous? This seems like more than we need.
By Taylor's theorem (with Lagrange form of the remainder), if $f$ is twice differentiable on $(a,b),$ $f'$ is continuous on $[a,b]$ and $x^*\in [a,b]$ is a root of $f,$ then for any $x_n \in [a,b]$ we have
$$ 0=f(x^*)=f(x_n)+f'(x_n)(x^*-x_n)+\frac{1}{2}f''(\xi_n)(x^*-x_n)^2 $$
for some $\xi_n$ between $x_n$ and $x^*.$ Up to this point, we have not required that $f''$ is continuous on $[a,b].$
However, for the purposes of Newton's method, we usually assume further that $f''$ is continuous on $[a,b]$, because without some bound on the size of $f''$, Newton's method may fail to converge. (Continuity of $f''$ on the compact interval $[a,b]$ supplies such a bound.)
Letting $\epsilon_n = x_n - x^*$ and using $x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)},$ the above equation rearranges to
$$ \epsilon_{n+1} = \frac{ f''(\xi_n) }{2 f'(x_n) } \epsilon_n^2 $$
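This error recursion is easy to check numerically. Below, for the illustrative choice $f(x) = x^2 - 2$ (not from the answer above), the ratio $\epsilon_{n+1}/\epsilon_n^2$ should approach $f''(x^*)/(2f'(x^*)) = 1/(2\sqrt{2}) \approx 0.3536$:

```python
import math

# Check e_{n+1} = f''(xi_n) / (2 f'(x_n)) * e_n^2 for f(x) = x^2 - 2,
# whose positive root is sqrt(2); here f'' = 2 and f'(x*) = 2*sqrt(2).
root = math.sqrt(2)
x = 2.0
errors = [x - root]
for _ in range(3):
    x = x - (x * x - 2) / (2 * x)  # one Newton step
    errors.append(x - root)

for e_n, e_next in zip(errors, errors[1:]):
    print(f"e_n = {e_n:.3e},  e_(n+1)/e_n^2 = {e_next / e_n**2:.4f}")
```

Each error is roughly a constant multiple of the *square* of the previous one, which is the quadratic convergence the recursion predicts.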
If we have no assumption controlling the size of $f''(\xi_n)$, then we cannot guarantee that a small $\epsilon_n$ leads to a smaller $\epsilon_{n+1}.$ But if we can say that $|f''(\xi_n)| < L$, then we have $\left| \frac{\epsilon_{n+1}}{\epsilon_n} \right|< 1$ whenever $$ | \epsilon_n | < \frac{2}{L} | f'(x_n)|.$$
Thus, roughly speaking, if your initial guess $x_0$ is within $\frac{2}{L}|f'(x_0)|$ of the root (and this condition continues to hold along the iterates), the errors shrink at each step and the sequence of approximations produced by Newton's method converges to that root.
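To see that some bound on $f''$ is genuinely needed and not just an artifact of the proof, consider the standard example $f(x) = \sqrt[3]{x}$ (my illustration, not from the answer): its only root is $0$, but $f''(x) \sim |x|^{-5/3}$ is unbounded near it, and Newton's method diverges from every nonzero starting point:

```python
import math

# f(x) = cbrt(x): root at 0, but f'' blows up there, so no bound L exists
# on any interval containing the root.
def f(x):
    return math.copysign(abs(x) ** (1 / 3), x)

def fprime(x):
    return abs(x) ** (-2 / 3) / 3

x = 0.1  # start very close to the root
for n in range(5):
    x = x - f(x) / fprime(x)  # algebraically this is x - 3x = -2x
    print(f"x_{n+1} = {x}")
# The iterates double in magnitude at every step and diverge,
# despite the arbitrarily small initial error.
```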