Topological Entropy of Tent Map


I am interested in the topological entropy of the map $T_s(x) = \min\{ sx, s(1-x)\}$, the tent map with slopes $\pm s$ and peak at $x = \frac{1}{2}$, for $s \in [1, 2]$.

When $s = 2$, this is the standard tent map and the topological entropy is $\log 2$.

I do not need an exact answer; I am just wondering which of the following holds:

a) The topological entropy is $\log s$.

b) The topological entropy is NOT $\log s$.

I have found one source that claims (without a clear justification) that the answer is $\log s$. I am inclined to believe it is probably NOT $\log s$, but I cannot find a clear argument either way. If you could provide any direction or intuition for either answer, that would be greatly appreciated.

Thank you in advance.


Accepted answer:

Kind of a duplicate of topological entropy of general tent map, but OK.

As pointed out in the answer to the other question, up to a countable set (the backward orbits of $0$ and $1$), these tent maps are topologically conjugate to a full shift on 2 symbols, so the topological entropy is $\log 2$.

The conjugacy comes from the Markov partition $\{[0, 1/s], [1-1/s, 1]\}$.

Edit: this was for $s>2$.
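To make the conjugacy concrete for $s > 2$, here is a sketch in Python (my own illustration; `decode` and `encode` are names I made up, not from the answer). Composing the two inverse branches of $T_s$ turns any finite 0/1 word into a point of the surviving Cantor set, and reading off which side of $1/2$ the orbit visits recovers the word:

```python
def decode(sym, s, y=0.5):
    """Map a finite 0/1 itinerary to a point by composing the two
    inverse branches of T_s. For s > 2 the left inverse y -> y/s
    lands in [0, 1/s] and the right inverse y -> 1 - y/s lands in
    [1 - 1/s, 1]; each step contracts by 1/s."""
    x = y
    for a in reversed(sym):
        x = x / s if a == 0 else 1 - x / s
    return x

def encode(x, s, n):
    """Read off the first n itinerary symbols of x under T_s:
    0 if the current orbit point is left of 1/2, 1 otherwise."""
    sym = []
    for _ in range(n):
        sym.append(0 if x < 0.5 else 1)
        x = min(s * x, s * (1 - x))  # the tent map T_s
    return sym
```

For $s = 3$, `encode(decode(sym, 3.0), 3.0, len(sym))` recovers `sym` for any moderate-length word: every itinerary is realized and determines its point, which is exactly the full-2-shift picture (the countable exceptional set corresponds to orbits hitting the peak).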

Heuristic for why $h(T)=\log s$ if $1<s<2$:

Let's work with the Bowen–Dinaburg definition of topological entropy in terms of the maximal number $N(n, \epsilon)$ of $(n, \epsilon)$-separated points.

Clearly, $N(0,\epsilon) \sim 1/\epsilon$, by taking regularly spaced points. Now notice that if you take points regularly spaced at intervals $\epsilon/s$, almost every pair is going to be $(1,\epsilon)$-separated: indeed, the distance between points on the same side of $1/2$ is stretched by a factor $s$. For points on different sides of $1/2$ this is not necessarily true, but most of those pairs start far away from each other (i.e. at distance more than $\epsilon$), so they are also $(1,\epsilon)$-separated.
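As a sanity check on this heuristic, one can lower-bound $N(n, \epsilon)$ numerically with a greedy selection over a fine grid (a rough sketch of my own; `n_separated_count` and the grid size are arbitrary choices, and greedy only produces a maximal, not maximum, separated set):

```python
def tent(s, x):
    """The tent map T_s(x) = min(s*x, s*(1-x))."""
    return min(s * x, s * (1 - x))

def n_separated_count(s, n, eps, grid=2000):
    """Greedy lower bound on N(n, eps) for T_s on [0, 1].
    Two points are (n, eps)-separated if their orbit segments
    x, T(x), ..., T^n(x) differ by more than eps at some step."""
    # Precompute the orbit segment of every grid point.
    orbits = []
    for i in range(grid + 1):
        orb = [i / grid]
        for _ in range(n):
            orb.append(tent(s, orb[-1]))
        orbits.append(orb)
    # Greedily keep a point if it is (n, eps)-separated from all kept so far.
    chosen = []
    for orb in orbits:
        if all(max(abs(a - b) for a, b in zip(orb, c)) > eps for c in chosen):
            chosen.append(orb)
    return len(chosen)
```

With $s = 1.5$ and $\epsilon = 0.05$, the successive ratios `n_separated_count(s, n+1, eps) / n_separated_count(s, n, eps)` should hover around $s$, consistent with $N(n,\epsilon) \sim s^n/\epsilon$.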

Similarly, you can convince yourself that $N(n,\epsilon) \sim s^n/\epsilon$. Since $h(T) = \lim_{\epsilon \to 0} \limsup_{n \to \infty} \frac{1}{n} \log N(n,\epsilon)$, it follows that the entropy is $\log s$.
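For a more quantitative check: the topological entropy of a piecewise monotone interval map also equals the exponential growth rate of the lap numbers of its iterates (Misiurewicz–Szlenk), and for the tent map these can be counted exactly by tracking image intervals. A sketch (my own code, not from the answer):

```python
import math

def tent_lap_count(s, n):
    """Count the laps (maximal monotone pieces) of the n-th iterate
    of T_s(x) = min(s*x, s*(1-x)) on [0, 1]. Each lap of an iterate
    is linear, so we only need to track its image interval: a lap
    splits at the next step iff its image contains the peak 1/2."""
    intervals = [(0.0, 1.0)]  # T^0 = identity: one lap with image [0, 1]
    for _ in range(n):
        new = []
        for a, b in intervals:
            # Split at the peak if the image straddles it.
            pieces = [(a, 0.5), (0.5, b)] if a < 0.5 < b else [(a, b)]
            for lo, hi in pieces:
                if hi <= 0.5:  # increasing branch x -> s*x
                    new.append((s * lo, s * hi))
                else:          # decreasing branch x -> s*(1-x)
                    new.append((s * (1 - hi), s * (1 - lo)))
        intervals = new
    return len(intervals)

# For s = 2 the laps double each step: tent_lap_count(2.0, n) == 2**n.
# For 1 < s < 2, log(tent_lap_count(s, n)) / n approaches log(s) from
# above (the lap number behaves like C * s**n with C > 1).
h_est = math.log(tent_lap_count(1.5, 20)) / 20
```

The estimate converges rather slowly because of the constant $C$, but the growth rate itself matches the claimed $h = \log s$.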