I have gotten stuck trying to prove that the fixed-point iteration method converges faster than any geometric progression.
Background:
Assume that the function $g$ is continuously differentiable. Let $x^*$ be the solution to the equation $x = g(x)$, i.e. $x^* = g(x^*)$. Let $x_n$ be the sequence of iterates generated according to the rule $x_{n+1} = g(x_n)$, $n = 0,1,2,\dots$
Claim:
If $g^\prime(x^*) = 0$ and $x_n \to x^*$, then for any $q>0$ it holds that $\frac{|x_n - x^*|}{q^n} \to 0$ as $n\to \infty$.
My proof:
Let $g^\prime(x^*) = 0$ and fix some $q>0$. Then $\lim_{n \to \infty} \frac{|x_n - x^*|}{q^n} = \lim_{n \to \infty} \frac{|g(x_{n-1}) - g(x^*)|}{q^n}$. Using Taylor's formula for $g(x_{n-1})$ at the point $x^*$, we get $g(x_{n-1}) = g(x^*) + \frac{g^\prime (x^*)}{1!}\cdot (x_{n-1} - x^*) + R_n(x_{n-1}, x^*)$, where $R_n(x_{n-1}, x^*)$ is the Taylor remainder. Since we assumed $g^\prime(x^*) = 0$, this gives $\lim_{n \to \infty} \frac{|g(x_{n-1}) - g(x^*)|}{q^n} = \lim_{n \to \infty} \frac{|g(x^*) + R_n(x_{n-1}, x^*) - g(x^*)|}{q^n} = \lim_{n \to \infty} \frac{|R_n(x_{n-1}, x^*)|}{q^n}$.
This is as far as I could get. Intuitively the remainder $R_n(x_{n-1}, x^*)$ should tend to $0$ as $n \to \infty$, but for $q<1$ the denominator $q^n$ also tends to $0$, so I am not sure I can conclude that $\lim_{n \to \infty} \frac{|R_n(x_{n-1}, x^*)|}{q^n} = 0$.
If $g'(x^*)=0$ and the derivative is continuous, set $c=\frac{\min(q,1)}{2}$; then there is a $\delta>0$ so that $|g'(x)|<c$ for $|x-x^*|<\delta$. Now suppose $N$ is large enough that $|x_N-x^*|<\delta$ (this holds eventually, since the iteration converges to $x^*$). Because $c<1$, the mean value theorem gives $|x_{n+1}-x^*|=|g'(\xi_n)|\,|x_n-x^*|<c\,|x_n-x^*|<\delta$ for some $\xi_n$ between $x_n$ and $x^*$, so every later iterate stays in the $\delta$-neighborhood, and inductively $$ |x_{n+1}-x^*|<c\,|x_n-x^*|<\dots<c^{\,n-N+1}|x_N-x^*|. $$ Since $c\le \frac q2$, it follows that $$ \frac{|x_{n+1}-x^*|}{q^{n+1}} < \frac{c^{\,n-N+1}}{q^{n+1}}\,|x_N-x^*| \le \frac{(q/2)^{\,n-N+1}}{q^{n+1}}\,|x_N-x^*| = \frac{|x_N-x^*|}{q^{N}}\cdot 2^{-(n-N+1)} \longrightarrow 0 \quad (n\to\infty), $$ which is the claim.
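As a quick numerical sanity check (my own example, not part of the argument above), take Newton's map for $f(x)=x^2-2$ written as a fixed-point iteration: $g(x)=\frac x2+\frac1x$. Its fixed point is $x^*=\sqrt2$, and $g'(x)=\frac12-\frac1{x^2}$ gives $g'(x^*)=0$, so the hypothesis of the claim holds. Even for a small ratio like $q=0.1$, the quantities $|x_n-x^*|/q^n$ collapse toward $0$:

```python
import math

def g(x):
    # Newton's map for f(x) = x^2 - 2; fixed point x* = sqrt(2), g'(x*) = 0
    return x / 2 + 1 / x

x_star = math.sqrt(2)
q = 0.1          # a small geometric ratio; the claim says any q > 0 works

x = 1.0          # starting point x_0
ratios = []
for n in range(1, 8):
    x = g(x)
    ratios.append(abs(x - x_star) / q**n)

# |x_n - x*| / q^n should shrink toward 0 even though q^n itself shrinks,
# because the error is roughly squared at every step (quadratic convergence).
for n, r in enumerate(ratios, start=1):
    print(n, r)
```

The ratios drop rapidly until the error reaches machine precision; with exact arithmetic they would tend to $0$ for every $q>0$, matching the claim.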