Consider an infinite sequence of characters $\ldots, x_{t-1}, x_t, x_{t+1}, \ldots$, each drawn from a finite alphabet and generated by an arbitrary stochastic process. For example, the alphabet may be the English alphabet and the sequence English text. Let $H$ be the average Shannon entropy of the process -- that is, the entropy of the conditional distribution of $x_t$ given the past $(x_1, \ldots, x_{t-1})$, averaged over all sufficiently large $t$.
Consider now the backwards process $\ldots, x_{t+1}, x_t, x_{t-1}, \ldots$. Let $H'$ be the average Shannon entropy of the backwards process, i.e. the entropy of $x_t$ conditioned on the "future" instead of the past. In the example above, this is the per-character entropy observed when reading English text backwards.
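One way to make the two quantities precise (assuming the limits below exist; this formalization is my own reading of the definitions above):

$$H = \lim_{n \to \infty} H\bigl(x_t \mid x_{t-1}, \ldots, x_{t-n}\bigr), \qquad H' = \lim_{n \to \infty} H\bigl(x_t \mid x_{t+1}, \ldots, x_{t+n}\bigr),$$

with both expressions additionally averaged over large $t$ whenever they depend on $t$.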
For some processes (e.g. when all characters are sampled independently), $H = H'$. Is it the case that $H = H'$ for all processes? If not, is there a simple counterexample to this rule?
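To make the quantities concrete, here is a minimal numeric sketch (Python/NumPy; the helper name `entropy_rate` is mine) for a stationary finite-state Markov chain. Under stationarity the time-reversed process is again a Markov chain, with transition probabilities $\tilde P_{ji} = \pi_i P_{ij} / \pi_j$ where $\pi$ is the stationary distribution, so both rates reduce to finite formulas:

```python
import numpy as np

def entropy_rate(P, pi):
    """Entropy rate -sum_i pi[i] sum_j P[i,j] log2 P[i,j] of a stationary chain."""
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(P > 0, P * np.log2(P), 0.0)  # convention: 0 log 0 = 0
    return -float(pi @ plogp.sum(axis=1))

# Forward transition matrix of a 3-state chain (rows sum to 1); chosen
# to be non-reversible, so the backward chain differs from the forward one.
P = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.6, 0.3],
              [0.3, 0.3, 0.4]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.abs(np.real(evecs[:, np.argmin(np.abs(evals - 1.0))]))
pi /= pi.sum()

# Time-reversed transition matrix: P_rev[j, i] = pi[i] * P[i, j] / pi[j].
P_rev = (P * pi[:, None]).T / pi[:, None]

print(entropy_rate(P, pi))      # forward rate H
print(entropy_rate(P_rev, pi))  # backward rate H'
```

For this chain the two printed rates coincide even though `P_rev` differs from `P` (indeed they agree for any stationary chain), consistent with the independent-sampling example above; so a counterexample, if one exists, would have to come from outside this stationary setting.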