How to prove the divergence of a fixed point iteration, in the context of the power tower


EDIT 2:

I realise now that such conditions are quite context-dependent. To give the original context in which I considered this question: I was researching the convergence of the power tower and read this article. The proof of the bounds of $x^{x^{x^{x^{\dots}}}}=y$ - with the results $y\in[e^{-1}, e]$, $x\in[e^{-e}, e^{1/e}]$ - relies on the assumption that the power tower converges if and only if the derivative of the iterating function is less than one over the range of the sequence $\{x, x^x, x^{x^x},\dots\}$. I thought that surely the convergence criterion isn't an "if and only if" here, in view of examples such as the one I show below. In this particular case of the power tower, can anyone clarify convergence/divergence from the fixed-point perspective, for the sequence defined by $x_n=(x_0)^{x_{n-1}}$, i.e. $f(x)=(x_0)^x$?
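To make the context concrete, here is a quick numerical sketch of the iteration $x_n=(x_0)^{x_{n-1}}$ (the helper name `power_tower` is my own), showing convergence for $x_0=\sqrt{2}\in[e^{-e}, e^{1/e}]$ and divergence just above $e^{1/e}\approx 1.4447$:

```python
import math

def power_tower(x0, n_iter=2000, tol=1e-12):
    """Iterate x_{k+1} = x0 ** x_k from x = x0; return the limit, or None if it diverges."""
    x = x0
    for _ in range(n_iter):
        try:
            x_next = x0 ** x
        except OverflowError:  # iterates blew past float range: clearly divergent
            return None
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return None  # no convergence within n_iter steps

# x0 = sqrt(2) < e^(1/e) ≈ 1.4447: the tower converges, to y = 2 (since sqrt(2)^2 = 2)
print(power_tower(math.sqrt(2)))  # ≈ 2.0
# x0 = 1.5 > e^(1/e): the iterates grow without bound
print(power_tower(1.5))           # None
```

Note that for $x_0=\sqrt{2}$ there are two fixed points, $y=2$ and $y=4$, and the iteration picks out the smaller, attracting one.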

OP:

Consider the sequence $x_n=f(x_{n-1})$, suppose that the derivative of $f$ is bounded as $\left|f'(x)\right|\leq\lambda$, and lastly suppose that $f$ has some fixed point $f(\mu)=\mu$.

The well-known proof of convergence for $\lambda\lt1$, using the mean value theorem, goes along the lines of:

$$\left|\mu-x_n\right|=\left|f(\mu)-f(x_{n-1})\right|=\left|f'(c_n)\right|\left|\mu-x_{n-1}\right|\leq\lambda\left|\mu-x_{n-1}\right|,\quad c_n\in[\mu,x_{n-1}]$$

And inductively:

$$\left|\mu-x_n\right|\leq\lambda^n\left|\mu-x_0\right| \implies \lim_{n\to\infty}x_n=\mu \quad\text{for }\lambda\lt1$$

As $\lim_{n\to\infty}\lambda^n=0$ for non-negative $\lambda\lt1$.
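As a numerical sanity check of this bound (my own example, using $f(x)=\cos x$: all iterates from $x_0=1$ stay in $[\cos 1, 1]$, where $|f'(x)|=|\sin x|\le\sin 1\approx 0.8415<1$):

```python
import math

# Contraction sketch with f(x) = cos(x): the MVT bound
# |mu - x_n| <= lambda^n |mu - x_0| applies with lambda = sin(1).
lam = math.sin(1.0)
mu = 0.7390851332151607  # the Dottie number, the fixed point of cos

x = 1.0
err0 = abs(mu - x)
for n in range(1, 31):
    x = math.cos(x)
    assert abs(mu - x) <= lam ** n * err0  # geometric error bound holds at every step
print(abs(mu - x))  # error after 30 steps, well below lam**30 * err0
```

In fact the error decays at the sharper rate $|f'(\mu)|=\sin\mu\approx 0.674$, so the bound above is comfortably satisfied.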

My question is:

Suppose $\lambda\ge1$, i.e. the derivative of $f$ cannot be bounded below one. Does this guarantee that the iterative sequence always diverges? Or does it merely say that it may converge, and we just do not know for sure? In the case $\lambda=1$ we only know that $\left|\mu-x_n\right|\le k$ for some constant $k$, and the left-hand side could therefore still be zero if we are lucky; the same reasoning, I believe, applies when $\lambda$ is strictly greater than one, where the right-hand side of the inequality is unbounded.

The proof that the iteration definitely converges given the bounded derivative relies on the fact that the right-hand side of the final inequality tends to zero - but the left-hand side can surely still be zero for less strictly bounded $\lambda$. In general, is there any rigorous rule that determines the divergence of an iterative sequence? Is my reasoning flawed, and is the $\lambda\lt1$ condition in fact an if-and-only-if condition for convergence? Or am I right in thinking that this condition is a sufficient one, and not a necessary one?

For all these questions I discard the trivial case where the initial $x_0=\mu$.

Thanks, I couldn’t find any further details online.

EDIT:

An example of a case where $\lambda\ge1$ yet the iteration converges:

The red line is the function $y=x^2$, the blue line is $y=x$, and the green line is my sloppy MS Paint rendition of a cobweb (staircase) diagram of the fixed-point iteration. The purple shaded region is the set of points $x$ for which $f'(x)<1$. I wish to highlight that my iteration began outside this region, so the $\lambda$ for this sequence over its entire range is greater than one - yet the sequence has clearly converged to a fixed point.

[figure: counter-example - cobweb diagram of the iteration for $y=x^2$]

I also wish to highlight that had I started in another region where $\lambda\gt1$, just a little beyond the fixed point at $x=1$, the sequence would have diverged - I believe the term is that $x=1$ is a repelling fixed point. For both fixed points $x=0$ and $x=1$, $\lambda$ over the whole domain is not $\lt1$, yet one attracts and the other repels. Is the right idea perhaps that $\lambda$ need not be $\lt1$ globally, but must be bounded below one in some neighbourhood of the fixed point? Can this be proven? To my mind, at least, that statement is markedly different from the original criterion for convergence I outlined before; indeed, the original criterion looks weak next to this alternative one.
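The two behaviours can be checked numerically (a throwaway sketch; `iterate` is my own helper):

```python
def iterate(f, x0, n):
    """Return x after n applications of f."""
    x = x0
    for _ in range(n):
        x = f(x)
    return x

f = lambda x: x * x  # f'(x) = 2x, so |f'(x)| < 1 only on (-1/2, 1/2)

# Start at x0 = 0.9, where f'(0.9) = 1.8 > 1: the orbit still falls into the
# contracting region and converges to the attracting fixed point x = 0.
print(iterate(f, 0.9, 8))    # ≈ 1.9e-12

# Start just beyond the repelling fixed point x = 1: the orbit blows up.
print(iterate(f, 1.01, 12))  # ≈ 5.0e17
```

This matches the neighbourhood idea: $|f'(0)|=0<1$ makes $x=0$ attracting, while $|f'(1)|=2>1$ makes $x=1$ repelling, regardless of any global bound on the derivative.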

ANSWER:

A somewhat trivial example: Consider $f(x)=\lambda x$. If $\lambda=1$, then clearly every point is a fixed point, and every iteration converges. On the other hand, if $\lambda=-1$, then the only iteration that converges is the one started at $x_0=0$ (and the same happens if you pick $|\lambda|>1$).
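A quick check of these cases (the helper `orbit_end` is mine; here $x_n=\lambda^n x_0$ exactly):

```python
def orbit_end(lam, x0, n=50):
    """Return x_n for the iteration x_{k+1} = lam * x_k, i.e. lam**n * x0."""
    x = x0
    for _ in range(n):
        x = lam * x
    return x

print(orbit_end(1.0, 0.3))   # 0.3     -- lambda = 1: every point is fixed
print(orbit_end(-1.0, 0.3))  # 0.3     -- lambda = -1: the orbit oscillates between +/-0.3
print(orbit_end(2.0, 0.3))   # ~3.4e14 -- |lambda| > 1: diverges for any x0 != 0
print(orbit_end(2.0, 0.0))   # 0.0     -- the unique convergent start when |lambda| > 1
```

Note that for $\lambda=-1$ every orbit is bounded but (except from $x_0=0$) never settles down, so boundedness alone is not convergence.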

In other words, if you're not in the Fixed Point Theorem situation, there's nothing you can say without extra properties of the problem.