A question about determining when the entropy is maximal.


Let $Y = \{1, 2, \dots, r\}$.

We are given that $X$ is the set of two-sided sequences with entries from $Y$, that $T$ is the two-sided shift on $X$, and that $m$ is a $T$-invariant probability measure on $X$.

If $p_i = m(\{x \in X \mid x_0 = i\})$ and $h(m)$ is the entropy of the dynamical system described above, then it is required to show that $h(m) \leq -\sum_{i=1}^{r} p_i \log p_i$, with equality exactly when $m$ is the product measure on $X$ obtained by assigning probability $p_i$ to $i$ in the space $Y$.
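To see that the inequality can be strict, one can compare a (non-product) Markov measure with the bound $-\sum_i p_i \log p_i$. The transition matrix below is purely illustrative; the entropy formula $h(m) = -\sum_i \pi_i \sum_j P_{ij}\log P_{ij}$ for a stationary Markov measure is standard.

```python
import math

# Illustrative 2-state Markov measure on the shift space (states Y = {1, 2}).
# Transition matrix P (rows sum to 1); this particular choice is arbitrary.
P = [[0.9, 0.1],
     [0.2, 0.8]]

# Stationary distribution pi solves pi P = pi; for a 2-state chain,
# pi = (P[1][0], P[0][1]) / (P[0][1] + P[1][0]).
denom = P[0][1] + P[1][0]
pi = [P[1][0] / denom, P[0][1] / denom]

# Entropy of the Markov measure: h(m) = -sum_i pi_i sum_j P_ij log P_ij.
h_markov = -sum(pi[i] * P[i][j] * math.log(P[i][j])
                for i in range(2) for j in range(2))

# Upper bound from the problem, with p_i = m({x : x_0 = i}) = pi_i.
h_bound = -sum(p * math.log(p) for p in pi)

print(h_markov, h_bound)
assert h_markov < h_bound  # strict inequality: m is not a product measure
```

Here the Markov measure has strictly smaller entropy than $-\sum_i p_i \log p_i$, consistent with equality holding only for the product measure.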

So I thought about adopting the technique used in Peter Walters' book, where he computes the entropy of the Bernoulli shift, but I got stuck at the expression for $H$.

I believe I need to determine when this expression is maximal, but I'm not sure how to proceed. Any help is appreciated.

Thank you!




Recall that the entropy of $m$ is equal to $$ h_m=\sup_\xi\inf_{n\in\mathbb N}\frac{H_m(\xi_n)}{n}, $$ where the supremum is taken over all finite measurable partitions $\xi$ and where $\xi_n=\bigvee_{k=0}^{n-1}T^{-k}\xi$.

Since the partition $\eta$ into cylinder sets of length $1$ is a generator, we have $$ h_m=\inf_{n\in\mathbb N}\frac{H_m(\eta_n)}{n}\le H_m(\eta)=-\sum_{i=1}^r p_i\log p_i $$ (taking $n=1$, so that $\eta_n=\eta$). Equality holds when $m$ is the Bernoulli (product) measure, since the last expression is precisely the entropy of that measure.

For the converse direction: if equality holds, then $H_m(\eta_n)\ge nH_m(\eta)$ for every $n$, while subadditivity of $H_m$ gives $H_m(\eta_n)\le nH_m(\eta)$, so $H_m(\eta_n)=nH_m(\eta)$ for all $n$. Since $H_m(\xi\vee\zeta)\le H_m(\xi)+H_m(\zeta)$ with equality exactly when $\xi$ and $\zeta$ are independent, induction on $n$ shows the partitions $T^{-k}\eta$ are mutually independent, i.e. $m$ is the product measure.
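The equality $H_m(\eta_n)=nH_m(\eta)$ for the product measure can be checked numerically: for a Bernoulli measure the mass of a length-$n$ cylinder is the product of the one-coordinate probabilities, so the block entropy grows exactly linearly. The probabilities below are illustrative.

```python
import math
from itertools import product

# Bernoulli (product) measure on Y = {1, 2, 3} with illustrative probabilities.
p = [0.5, 0.3, 0.2]

def H_eta_n(n):
    """H_m(eta_n) = -sum over length-n cylinders of m(cyl) * log m(cyl).

    For the product measure, m([i_0 ... i_{n-1}]) = p[i_0] * ... * p[i_{n-1}].
    """
    total = 0.0
    for word in product(range(len(p)), repeat=n):
        mu = math.prod(p[i] for i in word)
        total -= mu * math.log(mu)
    return total

# H_m(eta) = -sum_i p_i log p_i
H1 = -sum(pi * math.log(pi) for pi in p)

for n in range(1, 5):
    # For the Bernoulli measure, H(eta_n) = n * H(eta), so H(eta_n)/n is constant.
    assert abs(H_eta_n(n) / n - H1) < 1e-10

print("H(eta_n)/n equals H(eta) =", H1)
```

The ratio $H_m(\eta_n)/n$ is constant in $n$, so the infimum over $n$ is already attained at $n=1$, matching the equality case in the answer above.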