I've googled this and seen different kinds of proofs, but they all use notation that I don't understand.
First of all, though, I need to understand what quadratic convergence even means. I read that it has to do with the speed of an algorithm; is that correct?
Ok, so I know that this is the Newton-Raphson method:
$$x_{n+1}=x_n-\dfrac{f(x_n)}{f'(x_n)}$$
How do I prove that it converges?
Thanks.
You look at the size of the next function value. For simple roots and close to the root, the function value is a measure of the distance to the root. By Taylor's theorem with Lagrange remainder, $$ f(x+h)=f(x)+f'(x)h+\frac12 f''(x+\theta h)h^2 $$ for some $\theta\in(0,1)$. Let $L=\max_{x\in I} |f''(x)|$ on an interval $I$ containing the iterates, and choose the Newton step $h=-\frac{f(x)}{f'(x)}$, i.e. set $f(x)+f'(x)h=0$. Then $$ |f(x+h)|\le \frac L2 h^2=\frac L2\frac{f(x)^2}{f'(x)^2} $$ Now absorb the first derivative into the constant and return to the iteration sequence $(x_n)$ to get $$ |f(x_{n+1})|\le C\,|f(x_n)|^2 \iff |C\,f(x_{n+1})|\le|C\,f(x_n)|^2 $$ where $C=\frac{L}{2m^2}$ with $$ 0< m\le |f'(x)|\le M<\infty \quad\text{on } I. $$
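To see the one-step bound numerically, here is a small check of my own (not part of the original answer) with $f(x)=x^2-2$ on $I=[1,2]$: there $f''\equiv 2$ gives $L=2$ and $f'(x)=2x\ge 2$ gives $m=2$, so $C=L/(2m^2)=1/4$.

```python
# Check |f(x_{n+1})| <= C * |f(x_n)|^2 for f(x) = x^2 - 2 on I = [1, 2],
# where L = max|f''| = 2 and m = min|f'| = 2, hence C = L / (2 m^2) = 1/4.

def f(x):
    return x * x - 2

def fp(x):
    return 2 * x

C = 2 / (2 * 2**2)  # = 0.25

x = 1.5  # start inside I = [1, 2]
for n in range(3):  # a few steps; beyond that, roundoff dominates the residual
    x_next = x - f(x) / fp(x)  # Newton step
    # the quadratic one-step bound derived above
    assert abs(f(x_next)) <= C * f(x) ** 2
    x = x_next
```

After only three steps the residual $|f(x_n)|$ has dropped from $0.25$ to about $10^{-12}$, which is the "digit-doubling" behavior the bound predicts.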
Repeated squaring leads to a dyadic power in the exponent, so that $$ |C\,f(x_n)|\le|C\,f(x_0)|^{2^n} $$ This is what is meant by quadratic convergence: the exponent is $2^n$ instead of $n$ as in linear convergence.
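Continuing the same illustrative example ($f(x)=x^2-2$, $C=1/4$, $x_0=1.5$, my own choice of numbers): here $|C\,f(x_0)|=0.0625<1$, so the iterated bound collapses doubly exponentially.

```python
# Check the iterated bound |C f(x_n)| <= |C f(x_0)|^(2^n)
# for f(x) = x^2 - 2 with C = 1/4 and x0 = 1.5, where |C f(x0)| = 0.0625 < 1.

def f(x):
    return x * x - 2

C = 0.25
x0 = 1.5
x = x0
for n in range(4):
    # dyadic exponent 2^n, versus exponent n for linear convergence
    assert C * abs(f(x)) <= (C * abs(f(x0))) ** (2 ** n)
    x = x - f(x) / (2 * x)  # Newton step, f'(x) = 2x
```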
A sufficient condition to guarantee convergence is then $|C\,f(x_0)|<1$, i.e., the iteration has to start close enough to the root.
For the distance to the root $x_*$, use the mean value theorem: $$ f(x)=f(x)-f(x_*)= f'(x_*+\theta(x-x_*))\,(x-x_*) $$ for some $\theta\in(0,1)$, so that $$ m\,|x-x_*|\le |f(x)|\le M\,|x-x_*|\iff \frac{|f(x)|}M\le |x-x_*|\le\frac{|f(x)|}m. $$
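The two-sided bound is easy to verify numerically; again with my illustrative $f(x)=x^2-2$ on $[1,2]$, where $m=2\le|f'(x)|\le 4=M$ and $x_*=\sqrt2$:

```python
import math

# Bracketing the error by the residual: |f(x)|/M <= |x - x_*| <= |f(x)|/m
# for f(x) = x^2 - 2 on [1, 2], where 2 = m <= |f'(x)| <= M = 4 and x_* = sqrt(2).

def f(x):
    return x * x - 2

m, M = 2.0, 4.0
x_star = math.sqrt(2)

for x in [1.1, 1.3, 1.5, 1.9]:
    err = abs(x - x_star)
    # both inequalities from the mean value theorem argument
    assert abs(f(x)) / M <= err <= abs(f(x)) / m
```

So once $|f(x_n)|$ is small, the error $|x_n-x_*|$ is small too, up to the fixed factors $1/m$ and $1/M$.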