In an old exam, there is the following question:
Let $f: \mathbb{R}\rightarrow\mathbb{R}$ be twice continuously differentiable with $f(x^*)=0$ and $f'(x^*)\neq 0$. Prove that the iteration $x_{k+1}=x_k- \frac{f(x_k)}{f'(x_0)}$ converges linearly to $x^*$ if $x_0$ is close enough to $x^*$.
I have tried it as a fixed-point iteration. Let $\phi(x)=x-\frac{f(x)}{f'(x_0)}$. Then $x^*$ is a fixed point of $\phi$, because $$x^*=\lim\limits_{k\rightarrow \infty}x_{k+1}=\lim\limits_{k\rightarrow \infty}\phi(x_k)=\phi\Big(\lim\limits_{k\rightarrow \infty}x_{k}\Big)=\phi(x^*).$$ Using the Mean Value Theorem I get $$ | x_{k+1}-x^*|=|\phi(x_k)- \phi(x^*)|=|\phi'(\xi)|\,|x_k - x^*|, \quad \text{with $\xi$ between $x_k$ and $x^*$.} $$ Wouldn't $L=|\phi'(\xi)|$ be enough to prove linear convergence? My problem is that I don't know whether this approach is correct, because, for example, I never use the assumption that $f$ is twice differentiable. I figure a Taylor expansion might be another possible approach, but I haven't tried it.
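For intuition, here is a small numerical sketch of the iteration (my own stand-in example, with $f(x)=x^2-2$, so $x^*=\sqrt{2}$, and $x_0=2$; none of these choices come from the exam problem). The error ratios $|x_{k+1}-x^*|/|x_k-x^*|$ seem to settle near a constant rather than going to zero, which is what linear convergence would look like:

```python
import math

# Sketch only: simplified Newton iteration x_{k+1} = x_k - f(x_k)/f'(x_0),
# tried on the stand-in example f(x) = x^2 - 2, whose root is x* = sqrt(2).
def f(x):
    return x * x - 2.0

def fprime(x):
    return 2.0 * x

x_star = math.sqrt(2.0)
x0 = 2.0                # starting point, reasonably close to x*
slope = fprime(x0)      # the derivative stays frozen at x0 for every step

x = x0
prev_err = abs(x - x_star)
for k in range(10):
    x = x - f(x) / slope
    err = abs(x - x_star)
    # For linear convergence the ratio err / prev_err should approach a
    # constant L with 0 < L < 1 (here it looks close to |1 - f'(x*)/f'(x0)|).
    print(f"k={k+1:2d}  err={err:.3e}  ratio={err / prev_err:.4f}")
    prev_err = err
```

In this example the printed ratios appear to approach roughly $|1-\sqrt{2}/2|\approx 0.29$, but of course this is only a numerical check for one particular $f$, not a proof.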