Let $I$ denote the unit interval and $\mu$ the Lebesgue measure on it. Let $S:I\to I$ be the doubling map $S(x)=2x \pmod{1}$. Then it is known that for any measurable subsets $A$ and $B$ of $I$ we have $$ \lim_{n\to \infty} \mu(S^{-n}A\cap B) = \mu(A)\mu(B). $$
So if we fix a subset $A$, we see that $A$ gets "mixed out" in the interval by backwards iterates of $S$.
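The limit can be checked numerically. Here is a minimal Monte Carlo sketch (my own illustration; the intervals $A=[0,1/3)$ and $B=[1/2,1)$ are arbitrary choices): it estimates $\mu(S^{-n}A\cap B)$ as the fraction of uniform samples $x$ lying in $B$ with $S^n(x)\in A$, and this settles near $\mu(A)\mu(B)=1/6$ as $n$ grows.

```python
import random

def doubling(x):
    # the map S(x) = 2x mod 1
    return (2.0 * x) % 1.0

def estimate_intersection(n, in_A, in_B, samples=200_000, seed=0):
    """Monte Carlo estimate of mu(S^{-n}A ∩ B): the fraction of uniform
    points x with x in B and S^n(x) in A."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        x = rng.random()
        if in_B(x):
            y = x
            for _ in range(n):
                y = doubling(y)
            if in_A(y):
                hits += 1
    return hits / samples

# Illustrative sets: A = [0, 1/3), B = [1/2, 1), so mu(A)mu(B) = 1/6.
in_A = lambda x: x < 1/3
in_B = lambda x: x >= 1/2

for n in (0, 1, 5, 15):
    print(n, estimate_intersection(n, in_A, in_B))
```

At $n=0$ the estimate is $0$ (the sets are disjoint), and by $n=15$ it is close to $1/6\approx 0.1667$. Since each application of the doubling map consumes one bit of floating-point precision, $n$ should stay well below $52$ in this sketch.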
I would like to say that $A$ has a certain entropy to begin with, and that the entropy of $S^{-n}(A)$ increases as $n$ grows.
It's like what we read in not-so-rigorous thermodynamics: Start with a box with a partition. On one side of the partition is a gas, and the other side is vacuum. Once the partition is removed the gas takes up all the space. The entropy of the gas (whatever that means) increases in the process.
So my question is: Is there a rigorous formulation of the notion of entropy I was trying to hint at in my example? Also, can the notions of ergodicity and mixing be recast in the language of entropy?
Thank you.
Entropy reflects uncertainty about the present given knowledge of the past. Consider, for instance, the shift $T$ on $\{0,1\}^\mathbb{Z}$ with the $(\frac{1}{2},\frac{1}{2})$ Bernoulli measure and the partition $P := \{\{x : x_0 = 0\},\{x : x_0 = 1\}\}$. The join of $T^{-1}P,\dots,T^{-i}P$ separates points $x$ according to $(x_{-i},\dots,x_{-1})$. This is the "past", and it gives absolutely no information about which element of $P$ contains $x$ (since $x_0$ is completely independent of $x_{-i},\dots,x_{-1}$). Therefore, the entropy is as large as possible: $\log 2$.
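To make this concrete, here is a small sketch (my own illustration) that computes the entropy of the join $P \vee T^{-1}P \vee \dots \vee T^{-(n-1)}P$ for a Bernoulli($p$) shift directly from the cylinder probabilities. For $p=\frac12$ the per-symbol entropy $H_n/n$ equals $\log 2$ for every $n$, matching the claim above.

```python
import math
from itertools import product

def join_entropy(n, p=0.5):
    """Entropy (natural log) of the join P ∨ T^{-1}P ∨ ... ∨ T^{-(n-1)}P
    for the Bernoulli(p) shift: sum over the 2^n cylinder sets determined
    by the coordinates x_0, ..., x_{n-1}."""
    H = 0.0
    for word in product((0, 1), repeat=n):
        prob = 1.0
        for bit in word:
            prob *= p if bit == 1 else 1 - p
        H -= prob * math.log(prob)
    return H

for n in (1, 2, 5):
    print(n, join_entropy(n) / n)  # log 2 ≈ 0.6931 at every n when p = 1/2
```

For a biased coin ($p \ne \frac12$) the past is still independent of the present, but $H_n/n$ drops below $\log 2$: the maximal value is attained only by the fair coin.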
Strong mixing means that sets get spread out over time.
One somewhat superficial difference between mixing and entropy is that mixing is a property of the total system. For example, the disjoint union of two systems of positive entropy still has positive entropy, but the total system is not mixing (it is not even ergodic). If you restrict to ergodic systems and then compare mixing with positive entropy, I believe neither one implies the other.
I knew I wrote more about entropy before, and I just tracked it down, and it was on one of your questions! See here.