Invertible 1D map cannot be chaotic


In a compendium I have on chaos theory it is claimed, but not proven, that an invertible 1D map $x(k+1)=f(x(k))$ cannot be chaotic, since it will not display sensitivity to initial conditions.

I assume then that they claim that the Lyapunov exponent for such a map is necessarily negative (for all initial conditions). How can one show this?

Typically we define the Lyapunov exponent as $$\lambda = \lim_{n \to \infty} \frac{1}{n}\sum_{k=0}^{n-1}\ln |f'(x(k))|. $$

It is not apparent to me how we can infer $\lambda < 0$ by knowing that $f$ is invertible.
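Numerical experiments do seem to back the claim up. As an illustration (my own sketch, not from the compendium), consider the invertible map $f(x)=x+\tfrac12\sin x$: it is strictly increasing since $f'(x)=1+\tfrac12\cos x>0$, orbits started in $(0,\pi)$ converge to the fixed point $\pi$ where $|f'(\pi)|=\tfrac12$, and the estimated exponent indeed comes out negative:

```python
import math

def f(x):
    # strictly increasing, hence invertible: f'(x) = 1 + 0.5*cos(x) > 0
    return x + 0.5 * math.sin(x)

def df(x):
    return 1.0 + 0.5 * math.cos(x)

# estimate the Lyapunov exponent along one orbit
x = 1.0
n = 10_000
total = 0.0
for _ in range(n):
    total += math.log(abs(df(x)))
    x = f(x)
lam = total / n

print(lam)  # close to ln(1/2) ≈ -0.693; the orbit settles at the stable fixed point pi
```

Of course, one example is not a proof, which is what I am asking for.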

Best answer:

Looking at the Lyapunov exponent or sensitivity to initial conditions alone won’t suffice. For example, for $f(x)=5x$, which is invertible, you obviously have $λ=\ln(5)$ (which is the correct result) and sensitivity to initial conditions.
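A quick numerical sanity check of this value (a sketch; the result is exact here because $f'\equiv 5$):

```python
import math

def f(x):
    return 5.0 * x        # invertible, but orbits are unbounded

x = 1e-6                  # start small so 5**50 * x stays finite in a float
n = 50
total = 0.0
for _ in range(n):
    total += math.log(5.0)   # |f'(x)| = 5 everywhere
    x = f(x)
lam = total / n

print(lam)                # ln(5) ≈ 1.609: positive, despite f being invertible
```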

However, this example is also not chaotic – the dynamics is not bounded. A typical chaotic dynamics achieves the combination of boundedness and a positive Lyapunov exponent by stretching on the local scale (positive Lyapunov exponent) and folding on the scale of the attractor/state space (boundedness). The latter operation is not invertible.
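The logistic map $f(x)=4x(1-x)$ is a standard example of this stretch-and-fold mechanism: it maps $[0,1]$ onto itself two-to-one (so it is not invertible), stays bounded, and its Lyapunov exponent is known to be $\ln 2$. A numerical sketch:

```python
import math

def f(x):
    return 4.0 * x * (1.0 - x)   # bounded on [0,1], folds the interval onto itself

def df(x):
    return 4.0 - 8.0 * x

x = 0.3
for _ in range(100):             # discard a short transient
    x = f(x)

n = 100_000
total = 0.0
for _ in range(n):
    total += math.log(abs(df(x)))
    x = f(x)
lam = total / n

print(lam)   # near ln(2) ≈ 0.693: positive exponent despite the bounded dynamics
```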

A bit more formally: suppose $f$ is piecewise continuous, invertible, and chaotic, with its dynamics bounded to the interval $I$. Then for every $k$, we can divide $I$ into non-overlapping intervals $I_1, …, I_n$ such that $f^k$ is continuous on the interior of each of them. For these, we have

$$|I| = \sum_{j=1}^n \left | I_j\right|.$$

Since $f$ is invertible, the images $f^k(I_1), …, f^k(I_n)$ must also be non-overlapping, and we have:

$$\sum_{j=1}^n \left | \,f^k(I_j)\right| ≤ |I|.$$

Now, if $|f'(·)|>1$ on some interval $J$, then $|f(J)| > |J|$. More generally, if $\sum_{i=0}^{k-1} \ln \left( \left |\, f'(\,f^i(·)) \right | \right ) > 0$ on $J$ (with $f^k$ continuous on $J$), then $|f^k(J)| > |J|$. If the dynamics is supposed to have a positive Lyapunov exponent, this expansion of intervals must apply on average (and for sufficiently large $k$), i.e.:

$$ \sum_{j=1}^n \left | I_j\right| < \sum_{j=1}^n \left | \,f^k(I_j)\right| .$$

Putting the three relations together, we arrive at the reductio ad absurdum:

$$ |I| = \sum_{j=1}^n \left | I_j\right| < \sum_{j=1}^n \left | \,f^k(I_j)\right| ≤ |I| .$$
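The overlap that invertibility forbids can also be seen numerically. For the (non-invertible) logistic map $f(x)=4x(1-x)$, with the hypothetical partition $I_1=[0,\tfrac12]$, $I_2=[\tfrac12,1]$ of $I=[0,1]$, both halves map onto all of $I$, so the image lengths sum to $2 > |I|$ – exactly what cannot happen for an invertible $f$:

```python
def f(x):
    return 4.0 * x * (1.0 - x)

def image_length(a, b, samples=10_001):
    # approximate |f([a, b])| by sampling; f is continuous and unimodal here,
    # so the image is the interval [min, max] of the sampled values
    ys = [f(a + (b - a) * i / (samples - 1)) for i in range(samples)]
    return max(ys) - min(ys)

halves = [(0.0, 0.5), (0.5, 1.0)]
total = sum(image_length(a, b) for a, b in halves)

print(total)   # 2.0 > |I| = 1: the images overlap, which an invertible map cannot do
```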