Finding the rate of convergence of Newton's method viewed as a fixed-point iteration


Given a differentiable function $f$, Newton's method is the fixed-point iteration $p_{k+1}=h(p_k)$, where $$h(x) = x- \frac{f(x)}{f'(x)}.$$

Prove that $$h'(x)=\frac{f(x)\cdot f''(x)}{\left(f'(x)\right)^2}.$$

Given that $f'(p)\neq 0$ and $f(p)=0$, show that $h'(p)=0$, and conclude the rate of convergence of the sequence $(p_k)$ to $p$.

My Attempt:

A direct computation with the quotient rule shows that $h'(x)=\frac{f(x)\cdot f''(x)}{\left(f'(x)\right)^2}$, and substituting $f(p)=0$ gives $h'(p)=0$.
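For reference, the computation can be written out explicitly (a sketch using the quotient rule on $f/f'$):

```latex
\begin{aligned}
h'(x) &= 1 - \frac{f'(x)\cdot f'(x) - f(x)\cdot f''(x)}{\left(f'(x)\right)^2} \\
      &= 1 - 1 + \frac{f(x)\cdot f''(x)}{\left(f'(x)\right)^2}
       = \frac{f(x)\cdot f''(x)}{\left(f'(x)\right)^2}.
\end{aligned}
```

Since $f(p)=0$ and $f'(p)\neq 0$, the numerator vanishes at $x=p$ while the denominator does not, so $h'(p)=0$.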

My question is: how can I conclude anything about the rate of convergence, when all we have is the definition of the rate of convergence, namely that $\alpha$ is the rate of convergence if there exists $K>0$ such that $|p_k - p| \le K \frac{1}{k^{\alpha}}$?
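As a numerical illustration (my own example, not part of the exercise): many texts measure Newton's method with the order-of-convergence definition, $|p_{k+1}-p| \le K\,|p_k-p|^{\alpha}$, under which $h'(p)=0$ yields $\alpha = 2$. The sketch below, assuming $f(x)=x^2-2$ with root $p=\sqrt{2}$ and starting point $p_0 = 1.5$, shows the error roughly squaring at each step:

```python
import math

def newton(f, fprime, p0, steps):
    """Iterate p_{k+1} = p_k - f(p_k)/f'(p_k) and return the whole sequence."""
    ps = [p0]
    for _ in range(steps):
        p = ps[-1]
        ps.append(p - f(p) / fprime(p))
    return ps

# Hypothetical test problem: f(x) = x^2 - 2, exact root p = sqrt(2).
ps = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.5, 3)
errs = [abs(p - math.sqrt(2)) for p in ps]

# Quadratic convergence: the ratios e_{k+1} / e_k^2 stay bounded
# (they approach f''(p) / (2 f'(p)) = 1 / (2 sqrt(2)) ≈ 0.354 here).
ratios = [errs[k + 1] / errs[k] ** 2 for k in range(len(errs) - 1)]
```

Printing `errs` shows each error is roughly the square of the previous one, which is the behavior the exercise wants you to conclude from $h'(p)=0$.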