Two definitions of topological entropy: Why do they coincide?


I guess you all know the definition of topological entropy by using open covers for $X$ being a compact topological space and $T\colon X\to X$ being a continuous map (for example given in Walters' "An Introduction to Ergodic Theory"); here is a recap if not:

If $\alpha,\beta$ are open covers of $X$, their join $\alpha\vee\beta$ is the open cover consisting of all sets of the form $A\cap B$ with $A\in\alpha$, $B\in\beta$. Similarly, we can define the join $\bigvee_{i=1}^n\alpha_i$ of any finite collection of open covers of $X$.

First one defines the entropy of an open cover of $X$ as follows.

If $\alpha$ is an open cover of $X$, let $N(\alpha)$ denote the minimal cardinality of a finite subcover of $\alpha$. We define the entropy of $\alpha$ by $H(\alpha)=\ln N(\alpha)$.

Now one defines the entropy of $T$ relative to $\alpha$.

If $\alpha$ is an open cover of $X$ and $T\colon X\to X$ is a continuous map then the entropy of $T$ relative to $\alpha$ is given by $$ h(T,\alpha)=\lim_{n\to\infty}\frac{1}{n} H\left(\bigvee_{i=0}^{n-1}T^{-i}\alpha\right). $$
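(It is perhaps worth recalling why this limit exists; the standard argument, sketched here from Walters' book, is via subadditivity. Using $H(\alpha\vee\beta)\le H(\alpha)+H(\beta)$ and $H(T^{-1}\alpha)\le H(\alpha)$ one gets $$ H\left(\bigvee_{i=0}^{n+k-1}T^{-i}\alpha\right)\le H\left(\bigvee_{i=0}^{n-1}T^{-i}\alpha\right)+H\left(T^{-n}\bigvee_{i=0}^{k-1}T^{-i}\alpha\right)\le H\left(\bigvee_{i=0}^{n-1}T^{-i}\alpha\right)+H\left(\bigvee_{i=0}^{k-1}T^{-i}\alpha\right), $$ so the sequence $a_n:=H\bigl(\bigvee_{i=0}^{n-1}T^{-i}\alpha\bigr)$ is subadditive, and Fekete's lemma gives that $\lim_n a_n/n$ exists and equals $\inf_n a_n/n$.)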

Finally, one can define the topological entropy of $T$:

If $T\colon X\to X$ is continuous, the topological entropy of $T$ is given by $$ h(T):=\sup_{\alpha}h(T,\alpha), $$ where $\alpha$ ranges over all open covers of $X$.


Now, for a special case, another definition of topological entropy is given, and it is said that this definition coincides with the one above.

Consider $X=\left\{0,1,2\right\}^{\mathbb{Z}}$ with the product topology. By Tychonoff's theorem, this is a compact topological space. Moreover, let $T\colon X\to X$ describe the following dynamics.

Let $\eta_n\in X$ denote the state of the process at time $n$, i.e. $\eta_n(x)$ denotes the state of the cell at location $x$ at time $n$.

A cell in state 1 becomes a 2 in the next step, a 2 becomes a 0 in the next step, and a 0 becomes a 1 in the next step if and only if at least one of its two neighbouring cells (to the left and to the right) is in state 1; otherwise it stays 0.
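The update rule can be sketched in a few lines of Python (a minimal illustration; the function name and the free-boundary convention on a finite window are my own, not from the paper):

```python
def gh_step(row):
    """Apply one step of the 3-state Greenberg-Hastings rule to the
    interior of `row` (a list over {0, 1, 2}).  Boundary cells lack a
    neighbour, so the returned list is shorter by one cell on each side."""
    new = []
    for i in range(1, len(row) - 1):
        s = row[i]
        if s == 1:
            new.append(2)              # excited (1) -> refractory (2)
        elif s == 2:
            new.append(0)              # refractory (2) -> rested (0)
        else:                          # rested (0): excited iff a neighbour is 1
            new.append(1 if (row[i - 1] == 1 or row[i + 1] == 1) else 0)
    return new
```

For example, `gh_step([0, 1, 0, 2, 0])` returns `[2, 1, 0]`: the 1 advances to 2, the middle 0 is excited by its left neighbour, and the 2 relaxes to 0.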

For this situation the following definition of topological entropy is given in "Some Rigorous Results for the Greenberg-Hastings Model" by Durrett and Steif:

Let $a_{n,m}$ be the number of $0,1,2$-valued configurations $\sigma$ on $[-m,m]\times [0,n-1]$ that can be extended to a possible evolution of the dynamics, that is, such that there exists $\sigma'\in\left\{0,1,2\right\}^{\mathbb{Z}^2}$ with $\sigma'=\sigma$ on $[-m,m]\times [0,n-1]$ and with $T(i\text{th row of }\sigma')=(i+1)\text{st row of }\sigma'$. The topological entropy is equivalently defined to be $$ h(X,T):=\sup_m\limsup_{n\to\infty}(\ln a_{n,m})/n. $$
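For small $n,m$ one can get a feel for this count by brute force. The sketch below (my own, not from the paper) enumerates all initial rows on a window wide enough to determine the central $2m+1$ columns for $n$ time steps, runs the dynamics, and counts the distinct space-time patches on $[-m,m]\times[0,n-1]$. Note the caveat: this counts patches realizable by forward orbits started at time $0$, whereas $a_{n,m}$ allows extensions over all of $\mathbb{Z}^2$, so it is an illustration rather than a verified computation of $a_{n,m}$.

```python
from itertools import product

def gh_step(row):
    """One step of the 3-state Greenberg-Hastings rule on the interior
    of `row`; the result is shorter by one cell on each side."""
    new = []
    for i in range(1, len(row) - 1):
        s = row[i]
        if s == 1:
            new.append(2)
        elif s == 2:
            new.append(0)
        else:
            new.append(1 if (row[i - 1] == 1 or row[i + 1] == 1) else 0)
    return new

def count_patches(n, m):
    """Count distinct space-time patches on [-m, m] x [0, n-1] arising
    from forward orbits (a lower bound / illustration for a_{n,m})."""
    width = 2 * m + 1
    radius = m + n - 1          # initial data needed to evolve n - 1 steps
    patches = set()
    for init in product((0, 1, 2), repeat=2 * radius + 1):
        row = list(init)
        rows = []
        for _ in range(n):
            centre = (len(row) - width) // 2
            rows.append(tuple(row[centre:centre + width]))
            row = gh_step(row)
        patches.add(tuple(rows))
    return len(patches)
```

For $n=1$ every row of length $2m+1$ occurs, so `count_patches(1, m)` is $3^{2m+1}$; already for $n=2$, $m=0$ only the pairs $(1,2)$, $(2,0)$, $(0,1)$, $(0,0)$ can occur in a single column.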

Durrett and Steif say that in this special situation, both definitions coincide.


Unfortunately, I do not see why both definitions coincide here, i.e. why in the mentioned special case that $X=\left\{0,1,2\right\}^{\mathbb{Z}}$ and $T$ as above it is $$ \sup_{\alpha}h(T,\alpha)=\sup_m\limsup_{n\to\infty}\frac{1}{n}\ln a_{n,m}. $$

Do you see why, and if so, could you please explain it to me?

Thank you, and best regards.

----- An idea

Consider the open cover $$ U_m=\left\{[x_{-m},\dots,x_m] : x_j\in\left\{0,1,2\right\},\ j=-m,-m+1,\dots,m\right\}, $$ where $[x_{-m},\dots,x_m]$ denotes the cylinder set of all points agreeing with $(x_{-m},\dots,x_m)$ in the coordinates $-m,\dots,m$. This is an open cover (in fact a partition into clopen sets) of $X=\left\{0,1,2\right\}^{\mathbb{Z}}$. Now I think one can use the following metric on $X$: $$ d(x,y)=2^{-n},\text{ where }n:=\sup\left\{k\geq 0: \forall -k<i<k: x_i=y_i\right\}. $$
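One can check directly that $\operatorname{diam}(U_m)\to 0$ in this metric: if $x,y$ lie in the same cylinder $[x_{-m},\dots,x_m]$, then $x_i=y_i$ for all $-m\le i\le m$, so $k=m+1$ is admissible in the supremum defining $n$, and hence $$ d(x,y)\le 2^{-(m+1)},\qquad\text{so}\qquad \operatorname{diam}(U_m)\le 2^{-(m+1)}\to 0\text{ as }m\to\infty. $$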

Now there is the following result in Walters (p. 173, Theorem 7.6 in my edition):

Let $(X,d)$ be a compact metric space and let $\left\{\alpha_n\right\}_{n=1}^{\infty}$ be a sequence of open covers of $X$ with $\operatorname{diam}(\alpha_n)\to 0$. If $h(T)<\infty$, then $\lim_{n\to\infty}h(T,\alpha_n)$ exists and equals $h(T)$.

As far as I can see, the sequence $\left\{U_m\right\}$ fulfills this when using the metric above. So $$ h(T)=\lim_{m\to\infty}h(T,U_m). $$ With $U_m^n:=U_m\vee T^{-1}U_m\vee\cdots\vee T^{-(n-1)}U_m$, I think it then holds that $$ h(T,U_m)=\lim_{n\to\infty}\frac{1}{n}\log N(U_m^n), $$ and therefore $$ h(T)=\lim_{m\to\infty}\lim_{n\to\infty}\frac{1}{n}\log N(U_m^n). $$

Because the sequence $(a_m)$ with $a_m:=h(T,U_m)$ is monotonically increasing (each $U_{m+1}$ refines $U_m$) and convergent, one can replace $\lim_m$ by $\sup_m$.

So what I have until now is that $$ h(T)=\sup_m h(T,U_m). $$

It remains to show that for any $m$ it is $$ h(T,U_m)=\lim_{n\to\infty}\frac{1}{n}\log a_{n,m}. $$

Do you know how to prove that equality?

----- Best answer

To compute $h(T,U_m)$, consider the shift on $Y\subset(\{0,1,2\}^{\mathbb{Z}})^{\mathbb{N}}$, where $Y=\{(x,Tx,T^2x,\ldots):x\in X\}$. One sees that $(X,T)$ is conjugate to $(Y,\sigma)$, where $\sigma$ is the left shift. Note that the cover $U_m$ corresponds to the cover of $Y$ by sets of the form $$\{(x,Tx,T^2x,\ldots):x\in X,\ x_{[-m,m]}=(a_{-m},\ldots,a_m)\}.$$ From this fact one sees that $N(U^n_m)$ equals the number of words of length $n$ that appear in the space $$\{(x_{[-m,m]},(Tx)_{[-m,m]},(T^2x)_{[-m,m]},\ldots):x\in X\},$$ which is $a_{n,m}$. So $$h(T,U_m)=\lim_{n\to\infty}\frac{1}{n}\log a_{n,m}.$$