I am learning Newton's Method for numerically finding roots.
The book (Sauer, *Numerical Analysis*) proves quadratic convergence for simple roots by rearranging the Taylor expansion of $f$ about $x_i$, evaluated at the root $r$: $$f(r)=f(x_i)+(r-x_i)f'(x_i)+\frac{(r-x_i)^2}{2}f''(c_i)$$ to $$x_i-\frac{f(x_i)}{f'(x_i)}=r+\frac{(r-x_i)^2}{2} \frac{f''(c_i)}{f'(c_i)}$$ $$\rightarrow x_{i+1}-r=\frac{(r-x_i)^2}{2} \frac{f''(c_i)}{f'(c_i)}$$ $$\Rightarrow e_{i+1}=e_i^2 \left|\frac{f''(c_i)}{2f'(c_i)}\right|$$
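As a sanity check, here is a small numerical experiment (my own sketch, not from the book) that illustrates the quadratic rate for a simple root; the test function $f(x)=x^2-2$ and the starting point are arbitrary choices:

```python
import math

def newton(f, fprime, x0, steps):
    """Return the Newton iterates x_0, x_1, ..., x_steps."""
    xs = [x0]
    for _ in range(steps):
        x = xs[-1]
        xs.append(x - f(x) / fprime(x))
    return xs

# Simple root: f(x) = x^2 - 2 has the root r = sqrt(2), where f'(r) != 0.
r = math.sqrt(2)
xs = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0, 5)
errors = [abs(x - r) for x in xs]

# Quadratic convergence: e_{i+1} / e_i^2 should approach
# |f''(r) / (2 f'(r))| = 2 / (2 * 2*sqrt(2)) = 1/(2*sqrt(2)) ≈ 0.3536
for e0, e1 in zip(errors, errors[1:]):
    if e0 > 1e-8:  # skip ratios once e_i is near machine precision
        print(e1 / e0**2)
```

The printed ratios settle near $1/(2\sqrt 2)\approx 0.3536$, matching the constant $|f''(r)/(2f'(r))|$ from the error formula above.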
When $f'(r)=0$, the root $r$ has multiplicity greater than 1, and the expression above is invalid because of the division by zero.
The book then computes the convergence rate for $f(x)=x^m$ directly: the Newton iteration is $x_{i+1}=x_i-\frac{x_i^m}{m x_i^{m-1}}=\frac{m-1}{m}x_i$, and since the root is $r=0$, $$e_{i+1}=\frac{m-1}{m}\,e_i$$ This leads to the theorem:
Assume that the $(m+1)$-times continuously differentiable function $f$ on $[a,b]$ has a root $r$ of multiplicity $m$. Then Newton's Method is locally convergent to $r$, and the error $e_i$ at step $i$ satisfies $$\lim\limits_{i \to \infty} \frac{e_{i+1}}{e_i}=S$$ where $$S=\frac{m-1}{m}$$
How do you prove this for a general function? My guess is that you write $f(x)=(x-r)^m g(x)$ with $g(r)\neq 0$, where $m$ is the multiplicity, but I can't quite see how to proceed.