Relationship between entropy and ergodicity


Are there any direct connections between entropy and ergodicity? For example, does knowing that $(X,\mathcal{S},\mu,T)$ is ergodic help in computing the entropy?

I know that there are some indirect connections. For example, measure-theoretic entropy was developed as an isomorphism invariant to show that the $(p,1-p)$- and $(q,1-q)$-Bernoulli shifts are non-isomorphic when $p \notin \{q, 1-q\}$. But it seems merely coincidental that these shifts happen to be ergodic.
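To make the invariant concrete: the Kolmogorov–Sinai entropy of the $(p,1-p)$-Bernoulli shift equals the Shannon entropy of the one-step distribution, $h(p) = -p\log p - (1-p)\log(1-p)$. A minimal sketch (the function name is mine) comparing two shifts:

```python
import math

def bernoulli_shift_entropy(p: float) -> float:
    """Kolmogorov-Sinai entropy of the (p, 1-p) Bernoulli shift;
    it equals the Shannon entropy of the one-step distribution."""
    return -(p * math.log(p) + (1 - p) * math.log(1 - p))

h_half = bernoulli_shift_entropy(0.5)    # log 2
h_third = bernoulli_shift_entropy(1 / 3)

# Different entropy values certify that the (1/2, 1/2) and (1/3, 2/3)
# shifts are non-isomorphic; note h(p) = h(1-p), which is why the
# non-isomorphism condition is p not in {q, 1-q}.
print(h_half, h_third)
```

Since $h(p) = h(1-p)$, entropy cannot distinguish the $(p,1-p)$ shift from the $(1-p,p)$ shift, consistent with those two being isomorphic.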

What I'm looking for is something like "If $(X,\mathcal{S},\mu,T)$ is ergodic then [something about entropy]." OR "If [something about entropy] then $(X,\mathcal{S},\mu,T)$ is ergodic."

EDIT Per the request in the comments, I'm interested in the measure-theoretic entropy of a measure-preserving dynamical system.


There are 3 best solutions below


I'm not sure it is along the lines you wished for but I'll comment anyhow:

The fact that entropy respects the ergodic decomposition, i.e. that the decomposition $$ \mu =\int \mu_x \, d\nu(x)$$ implies $$ h_\mu (T)=\int h_{\mu_x} (T) \, d\nu(x),$$

(for a reference see

http://www.math.ethz.ch/~einsiedl/Pisa-Ein-Lin.pdf

section 3.5)

has consequences more or less of the form you want. For example, if there is a unique measure of maximal entropy, then it is ergodic. This quite trivial fact is used a lot.
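When the ergodic decomposition has finitely many components, the integral formula above reduces to a weighted sum. A minimal sketch (the numbers and function names are mine, chosen for illustration) for a measure that mixes two Bernoulli shifts:

```python
import math

def shannon_entropy(dist):
    """Shannon entropy of a finite distribution, in nats."""
    return -sum(q * math.log(q) for q in dist if q > 0)

# Two ergodic components: the Bernoulli(1/2) and Bernoulli(1/10) shifts.
# For a Bernoulli shift, h_{mu_x}(T) is the entropy of the marginal.
h_components = [shannon_entropy([0.5, 0.5]), shannon_entropy([0.1, 0.9])]

# Mixing weights nu for the (non-ergodic) combined measure mu.
weights = [0.3, 0.7]

# h_mu(T) = integral of h_{mu_x}(T) d nu(x) -- here a finite sum.
h_mu = sum(w * h for w, h in zip(weights, h_components))
print(h_mu)
```

In particular, the mixture is not ergodic, yet its entropy is still pinned down by the entropies of its ergodic components.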


The Shannon-McMillan-Breiman (SMB) theorem is along the lines of "if $(X,\mathcal{S},\mu,T)$ is ergodic then (something about entropy)". It extends the "Asymptotic Equipartition Property" (which initially applies to i.i.d. processes) to ergodic processes. One version of the SMB theorem is:
Let $\mathcal X=\{X_i,\;i=0,1,\dots\}$ be a stochastic process. Define the entropy (or entropy rate) of the process as $H(\mathcal X)=\lim_{n\rightarrow \infty} \frac 1n H(X_0,\dots,X_{n-1})$, where $H(X_0,\dots,X_{n-1})$ is the joint entropy. Define also the conditional $k$th-order entropy $H^k= H(X_k\mid X_{k-1},\dots,X_0)$, and the conditional entropy rate $H_c(\mathcal X)=\lim_{n\rightarrow \infty}H^n=\lim_{n\rightarrow \infty} H(X_n\mid X_{n-1},\dots,X_0)$. For completeness, for a random variable $Y$ with probability mass function $p$, $H(Y) = -E[\log p(Y)]$. Then, if the process is ergodic:

$$-\frac1n\log p(X_0,...,X_{n-1})\rightarrow H(\mathcal X) = H_c(\mathcal X) \;\text{with probability 1}$$
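This convergence is easy to see numerically in the simplest ergodic case, an i.i.d. Bernoulli process, where independence turns $-\frac1n\log p(X_0,\dots,X_{n-1})$ into an average of per-symbol log-probabilities (a minimal sketch; the parameter values are mine):

```python
import math
import random

random.seed(0)
p = 0.3
n = 200_000

# Sample an i.i.d. Bernoulli(p) path -- an ergodic process.
xs = [1 if random.random() < p else 0 for _ in range(n)]

# -1/n log p(X_0, ..., X_{n-1}); independence makes the joint
# log-probability a sum of per-symbol terms.
log_prob = sum(math.log(p) if x == 1 else math.log(1 - p) for x in xs)
smb_estimate = -log_prob / n

# Entropy rate H(X): for i.i.d., the Shannon entropy of one symbol.
entropy_rate = -(p * math.log(p) + (1 - p) * math.log(1 - p))
print(smb_estimate, entropy_rate)  # the two values are close for large n
```

For i.i.d. processes this is just the strong law of large numbers; the content of SMB is that the same almost-sure convergence holds for any ergodic process.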

The proof is a bit long, and in any case this is not my result. Some references: 1) Cover & Thomas (of course), "Elements of Information Theory", 2nd ed., Chapter 16; 2) Algoet, P. H., & Cover, T. M. (1988). A sandwich proof of the Shannon-McMillan-Breiman theorem. The Annals of Probability, 16(2), 899-909.


In the very general and broad context in which the question is posed, the so-called Fokker-Planck equations (FPE) and the corresponding master equations (ME) establish a deep relation between entropy and ergodicity (in some cases). The concrete formulations depend strongly on what type of system and what notion of entropy you have in mind. For instance, in some stochastic dynamical systems the connection runs through the Lyapunov exponents, which are related on one side to entropy production (information gain) and on the other to the stability properties that embody ergodicity (e.g. when the FPE or ME can be translated into higher-order Markov chains).

A good example of how deep and diverse the answer to your question can be is given by Phi-entropy inequalities and Fokker-Planck equations. Bolley and Gentil give an excellent treatment of this, and Part 2 of their 2010 article contains exactly a formulation of the form "is ergodic then [something about entropy]", and vice versa.

That said, I think the question would benefit from being formulated more precisely and with a stronger sense of direction; that would make it easier to give specific answers.