why is the entropy of a dynamical system interesting?


I took an introductory class on dynamical systems this semester. In class we've seen a great deal about the entropy of a dynamical system $T : X \to X$ on a metric space $(X, d)$. In the topological case the entropy is defined to be $$h(T) := \lim_{\epsilon \to 0 } \lim_{k \to \infty} \frac{1}{k} \log (\text{sep}(k, \epsilon, T)),$$ where $\text{sep}(k, \epsilon, T)$ is defined to be the maximal cardinality of a subset $B \subset X$ such that for all distinct $x, y \in B$ there is an $i \in \{0, \ldots, k - 1\}$ such that $d(T^i(x), T^i(y)) > \epsilon$.

In class it was said that the entropy measures how "chaotic" or "random" a map is. However, we never really saw any theorem or application that made clear why people care about entropy. It seems a rather arbitrary definition to me.

Can anyone explain to me why the topological entropy of a dynamical system is interesting?

Thanks!



Best answer:

Topological entropy is one of many invariants of topological conjugacy.

Like all such invariants, it has two main applications:

$1)$ distinguish dynamics that are not topologically conjugate (if $h(T)\ne h(S)$, then $T$ and $S$ are not topologically conjugate);

$2)$ detect complicated behavior (if $h(T)\ne0$, then $T$ has "complicated behavior").

The second aspect is extremely technical and is one of the more difficult topics in finite-dimensional dynamical systems theory. Let me state one rigorous result (among others) that may convey both the interest of the notion and its technical nature:

If $T$ is a $C^1$ diffeomorphism of a compact manifold and $h(T)>0$, then there exists an ergodic $T$-invariant probability measure with at least one positive Lyapunov exponent.


This is a bit more informal, but the following interpretation was helpful to me as a student:


Think of $\epsilon$ as the resolution with which you can view the dynamics: that is, if two points are closer than $\epsilon$, then you cannot distinguish them. Then $\operatorname{sep}(k, \epsilon, T)$ can be thought of as the number of "distinguishable trajectories of length $k$". Positivity of the entropy $h(T)$ implies that, for $\epsilon > 0$ sufficiently small, this quantity grows at a positive exponential rate.
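To make this concrete, here is a rough numerical sketch (my own illustration, not part of the standard theory): it estimates $\operatorname{sep}(k, \epsilon, T)$ for the doubling map $T(x) = 2x \bmod 1$ on the circle, whose topological entropy is $\log 2$. The greedy search over a finite grid of starting points only gives a lower bound, and the grid resolution caps the count for large $k$, so the printed rates are crude:

```python
import math

def T(x):
    """Doubling map T(x) = 2x mod 1 on the circle [0, 1)."""
    return (2.0 * x) % 1.0

def circle_dist(x, y):
    """Arc-length metric on the circle of circumference 1."""
    d = abs(x - y)
    return min(d, 1.0 - d)

def separated(x, y, k, eps):
    """True if the length-k orbits of x and y are eps-separated,
    i.e. some iterate i < k has circle_dist(T^i x, T^i y) > eps."""
    for _ in range(k):
        if circle_dist(x, y) > eps:
            return True
        x, y = T(x), T(y)
    return False

def sep_lower_bound(k, eps, grid=2000):
    """Greedy lower bound for sep(k, eps, T), seeded from a finite grid:
    keep a grid point iff it is (k, eps)-separated from all kept so far."""
    chosen = []
    for i in range(grid):
        x = i / grid
        if all(separated(x, y, k, eps) for y in chosen):
            chosen.append(x)
    return len(chosen)

eps = 0.1
counts = {k: sep_lower_bound(k, eps) for k in (4, 8, 12)}
for k, n in counts.items():
    # log(n)/k is a crude finite-k growth rate; the true entropy log 2
    # is only recovered in the limit k -> infinity, then eps -> 0.
    print(k, n, round(math.log(n) / k, 3))
```

The count of distinguishable length-$k$ trajectories grows quickly with $k$, which is exactly the exponential growth that positive entropy captures.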

What is the implication? Well, suppose you're modelling the dynamics of $T$ on a computer and want to capture all trajectories of length $n$. If $h(T) > 0$, then the amount of data required to do so grows exponentially in $n$, at a rate $\sim e^{n h(T)}$. This is why one often refers to entropy as capturing complexity of the system.
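As a back-of-the-envelope example (again taking the doubling map, for which $h(T) = \log 2$): specifying one of the roughly
$$\operatorname{sep}(n, \epsilon, T) \sim e^{\,n\, h(T)} = 2^{\,n}$$
distinguishable trajectories requires about $\log_2\!\big(e^{\,n\,h(T)}\big) = n\,h(T)/\log 2 = n$ bits, so the description length per trajectory grows linearly in $n$, and storing the full list of distinguishable trajectories costs on the order of $n \cdot 2^{\,n}$ bits.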