Consider Newton's method: begin with a seed $x_0$ and a differentiable function $f$. Then, to approximate a root of $f$, we iterate:
$$ x_n = x_{n-1} - \frac{f(x_{n-1})}{f'(x_{n-1})}$$
We hope that $x_n \to r$, where $f(r) = 0$.
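For concreteness, here is a minimal sketch of the iteration in Python; the function $f(x) = x^2 - 2$, its derivative, the seed, and the step count are arbitrary illustrative choices, not from the text.

```python
# Minimal sketch of Newton's method; f, fprime, x0, and steps are
# illustrative choices.
def newton(f, fprime, x0, steps=8):
    x = x0
    for _ in range(steps):
        x -= f(x) / fprime(x)  # x_n = x_{n-1} - f(x_{n-1}) / f'(x_{n-1})
    return x

# Approximates sqrt(2), the positive root of f(x) = x^2 - 2.
print(newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0))  # 1.41421356...
```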
This method is quadratically convergent as is. However, an exercise from Cheney and Kincaid's *Numerical Mathematics and Computing* asks the following:
To avoid computing the derivative at each step in Newton's method, it has been proposed to replace $f'(x_n)$ by $f'(x_0)$. Derive the rate of convergence for this method.
Some tests show that this new method, given by the iteration
$$ x_n = x_{n-1} - \frac{f(x_{n-1})}{f'(x_{0})}$$
converges approximately linearly. How does one show this?
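For reference, here is one such test, as a rough sketch (again with the arbitrary choices $f(x) = x^2 - 2$, root $r = \sqrt{2}$, and seed $x_0 = 1$): the successive error ratios $|e_n|/|e_{n-1}|$ level off near a constant (here roughly $0.414$), which is the signature of linear convergence.

```python
import math

# Frozen-derivative iteration: the divisor is f'(x0) throughout.
# f(x) = x^2 - 2 and x0 = 1.0 are arbitrary choices; the root is sqrt(2).
f = lambda x: x * x - 2
fprime = lambda x: 2 * x
x0, r = 1.0, math.sqrt(2)

d0 = fprime(x0)            # computed once, never updated
x, prev_err = x0, abs(x0 - r)
for n in range(1, 10):
    x -= f(x) / d0         # x_n = x_{n-1} - f(x_{n-1}) / f'(x0)
    err = abs(x - r)
    print(f"n={n}  e_n={err:.2e}  e_n/e_(n-1)={err / prev_err:.4f}")
    prev_err = err
```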
Attempt.
The text derives the quadratic convergence using a Taylor expansion. Mimicking this, I do the following:
Let $e_n = x_n - r$ represent the error at step $n$. Then we may write
$$ e_n = x_{n-1} - \frac{f(x_{n-1})}{f'(x_0)} - r$$
from which some manipulation yields
$$ e_n = \frac{e_{n-1}f'(x_0) - f(x_{n-1})}{f'(x_0)}$$
There appears to be no obvious Taylor expansion involving both $x_0$ and $x_{n-1}$ that still produces an error term. For reference, for the usual Newton method, the text uses the expansion
$$ f(x_n - e_n) = f(x_n) - e_n f'(x_n) + \frac{e_n^2 f''(\xi_n)}{2}$$
where $\xi_n$ is in some interval $(x_n - \delta_n, x_n + \delta_n)$. (The root $r$ is assumed to be in this interval.)
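For completeness, here is how that expansion yields the quadratic convergence (a standard step, filling in what I elided): since $f(r) = f(x_n - e_n) = 0$ and $x_{n+1} = x_n - f(x_n)/f'(x_n)$,
$$ e_{n+1} = e_n - \frac{f(x_n)}{f'(x_n)} = \frac{f''(\xi_n)}{2 f'(x_n)}\, e_n^2.$$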
This is where I'm stuck. Any guidance would be appreciated.
Answer.

If $x_n = x_{n-1} - \frac{f(x_{n-1})}{f'(x_{0})}$, then, since $f(x-h) = f(x) - hf'(x) + O(h^2)$,
$$\begin{aligned}
f(x_n) &= f\left(x_{n-1} - \frac{f(x_{n-1})}{f'(x_0)}\right)\\
&= f(x_{n-1}) - \frac{f(x_{n-1})}{f'(x_0)}\,f'(x_{n-1}) + O\left(f(x_{n-1})^2\right)\\
&= f(x_{n-1})\left(1 - \frac{f'(x_{n-1})}{f'(x_0)}\right) + O\left(f(x_{n-1})^2\right),
\end{aligned}$$
so
$$\frac{f(x_n)}{f(x_{n-1})} = 1 - \frac{f'(x_{n-1})}{f'(x_0)} + O\left(f(x_{n-1})\right).$$
As $x_{n-1} \to r$ this ratio tends to the constant $1 - \frac{f'(r)}{f'(x_0)}$, and since $f(x_n) \approx f'(r)\,e_n$ near the root, the error is multiplied by roughly that constant at each step: linear convergence, provided $\left|1 - \frac{f'(r)}{f'(x_0)}\right| < 1$.
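As a numerical sanity check (same arbitrary test function as above, $f(x) = x^2 - 2$ with seed $x_0 = 1$), the signed ratio $f(x_n)/f(x_{n-1})$ does settle near $1 - f'(r)/f'(x_0) = 1 - \sqrt{2} \approx -0.414$:

```python
import math

# Check the derived ratio: f(x_n)/f(x_{n-1}) -> 1 - f'(r)/f'(x0).
# f(x) = x^2 - 2 and x0 = 1.0 are arbitrary choices; r = sqrt(2).
f = lambda x: x * x - 2
fprime = lambda x: 2 * x
x0, r = 1.0, math.sqrt(2)

x = x0
for n in range(1, 10):
    x_new = x - f(x) / fprime(x0)
    print(f"n={n}  f-ratio={f(x_new) / f(x):+.4f}")
    x = x_new

print("predicted:", 1 - fprime(r) / fprime(x0))  # 1 - sqrt(2) = -0.4142...
```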