Functions that converge when repeatedly applied to themselves


Suppose $f: \mathbb{R}\rightarrow\mathbb{R}$ is a function with the property

$$ \lim_{n\rightarrow\infty}\; \underbrace{f\circ f\circ \dots \circ f}_{\mathrm{n \;times}}(x) = c = \mathrm{const.} $$

i.e. it converges in the limit to a finite constant when repeatedly applied to itself. Additionally, $\exists\, a, b \in \mathbb{R}$ s.t. $f(a)\neq f(b)$ should hold (i.e. only non-constant functions are considered).

As pointed out in the comments, $f(x) = \sqrt{|x|}$ is not a valid example (the iterates tend to $1$ for every $x \neq 0$, but $f(0)=0$, so there is no single limit $c$). So I don't have any example function with that property. $f(x) = \frac{1}{x}$, on the other hand, has bounded orbits, though they don't converge.
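A quick numerical sketch of why both candidates fail (Python; the helper `iterate` is my own illustrative choice, not from the question):

```python
import math

def iterate(f, x, n):
    """Apply f to x a total of n times."""
    for _ in range(n):
        x = f(x)
    return x

# sqrt(|x|): iterates approach 1 for any x != 0, but f(0) = 0,
# so the limit depends on the starting point -- not a single constant c.
print(iterate(lambda x: math.sqrt(abs(x)), 16.0, 60))  # very close to 1
print(iterate(lambda x: math.sqrt(abs(x)), 0.0, 60))   # stays 0

# 1/x: the orbit is bounded (it alternates between x and 1/x),
# but it never converges unless x = 1 or x = -1.
print(iterate(lambda x: 1.0 / x, 2.0, 7))  # odd number of steps: 0.5
print(iterate(lambda x: 1.0 / x, 2.0, 8))  # even number of steps: 2.0
```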

Now I am interested in the following aspects:

  • Can one identify a subset of all functions for which this property is present (for any value of $c$)?
  • Do such functions have other specific properties that are perhaps common among all of them (and related to the value of $c$)?
  • Do functions exist for which $c = 0$ (perhaps $f(x) = \sin(x)$ but I'm not sure how to approach this)?
  • Do functions exist for which $c \neq 0$?

I am particularly interested in the case $c = 0$.




Your square root example fails at zero: $f(0) = 0$, while the iterates from any $x \neq 0$ tend to $1$.

This is a simple class of examples, by no means exhaustive:

Take any differentiable function $f$ such that there is a constant $\varepsilon > 0$ with $$ | f'(x) | \leq 1 - \varepsilon $$ for all $x \in \mathbb{R}$.

Proposition 1: There is at most one fixed point (two distinct fixed points would force $f'(\xi) = 1$ somewhere between them, by the mean value theorem).

Proposition 2: There are points with $f(a) - a > 0$ and $f(b) - b < 0$.

Proposition 3: There is a fixed point $c$ (apply the intermediate value theorem to $f(x) - x$).

Proposition 4: The fixed point is globally attracting.
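A numerical illustration of this class (Python sketch; the specific choice $f(x) = \cos(x)/2$, which has $|f'(x)| = |\sin(x)|/2 \leq 1/2$, and the helper `orbit_limit` are mine):

```python
import math

# f(x) = cos(x)/2 satisfies |f'(x)| <= 1 - epsilon with epsilon = 1/2,
# so by the propositions above it has a unique, globally attracting
# fixed point.
f = lambda x: math.cos(x) / 2

def orbit_limit(f, x, n=100):
    """Iterate f starting from x for n steps."""
    for _ in range(n):
        x = f(x)
    return x

# Wildly different starting points all land on the same fixed point.
starts = [-1000.0, -1.0, 0.0, 3.0, 1e6]
limits = [orbit_limit(f, x) for x in starts]
print(limits)  # all entries agree, and f(c) = c at that common value
```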


The most common tool for dealing with these is the Banach fixed point theorem:

Let $f$ be a function from an interval $I$ to itself, and suppose there is some $k<1$ such that $|f(x)-f(y)|\le k|x-y|$ for all $x, y \in I$. Then there is a unique fixed point $a$ such that $f(a)=a$. Moreover, for any initial $x_0\in I$, the sequence $x_n=f(x_{n-1})=f^{n}(x_0)$ converges to $a$.
With minor tweaks, that interval $I$ can be replaced by any complete metric space.

So, what does this mean here, with your functions implicitly from $\mathbb{R}$ to itself? We can ensure the Lipschitz condition by bounding the derivative: if $|f'(x)|\le k$ for all $x$, then $|f(x)-f(y)|\le k|x-y|$ by the mean value theorem.

That's what I have to say on your first bullet point.

For your final batch of questions, here's a function for which the iterates converge to any given $c$: $$f_c(x)=\frac{x}{2}+\frac{c}{2}$$
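Checking that family numerically (Python sketch; the helpers `f_c` and `fixed_point` are my own names): $f_c$ has Lipschitz constant $k = 1/2$, so Banach guarantees convergence to its unique fixed point, which is exactly $c$.

```python
import math

def f_c(c):
    """f_c(x) = x/2 + c/2, a contraction with constant k = 1/2."""
    return lambda x: x / 2 + c / 2

def fixed_point(f, x, tol=1e-12, max_iter=10_000):
    """Iterate f until successive values agree to within tol."""
    for _ in range(max_iter):
        nxt = f(x)
        if abs(nxt - x) < tol:
            return nxt
        x = nxt
    return x

for c in [0.0, 1.0, -3.5, math.pi]:
    print(c, fixed_point(f_c(c), 100.0))  # the iterates converge to c
```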

For the second bullet point: the Banach theorem doesn't have a full converse. We can have that convergence without those conditions, as $\sin$ and $\cos$ demonstrate (yes, the iterates of $\sin$ converge to zero, very slowly). There is a partial converse: if the iterates $f^n$ converge to a fixed point $a$ in some neighborhood of $a$ and $f$ is differentiable, then $|f'(a)|\le 1$. We need that, or iterates starting near $a$ would move away from it.
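You can watch that slow convergence directly (Python sketch; `sin_iterate` is my own helper, and the rate $x_n \sim \sqrt{3/n}$ is a standard asymptotic, quoted here without proof):

```python
import math

def sin_iterate(x, n):
    """Apply sin to x a total of n times."""
    for _ in range(n):
        x = math.sin(x)
    return x

# |f'(0)| = cos(0) = 1, so sin is not a contraction near 0 and the
# Banach theorem does not apply -- yet the iterates still creep to 0,
# roughly like sqrt(3/n).
for n in (10, 100, 10_000):
    print(n, sin_iterate(1.0, n))  # positive and shrinking, but slowly
```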