Let $(X_{n})_{n\geq 0}$ be a stationary discrete Markov chain. I have to prove that:
For $n\geq 1$ $$ H(X_{n} | X_{0}) \geq H(X_{n-1} | X_{0})$$ $$ H(X_{0} | X_{n}) \geq H(X_{0} | X_{n-1})$$
Where $H(X)$ is the Shannon entropy.
The second inequality seems to me to be a corollary of the first, because if $$X_{1} \rightarrow X_{2} \rightarrow X_{3} \rightarrow \dots$$ is a Markov chain, then so is the reversed chain $$\dots \rightarrow X_{3} \rightarrow X_{2} \rightarrow X_{1}$$ (I am not completely sure about this).
But why is stationarity given as a hypothesis?
Maybe you can help me with this problem.
Just apply the data processing inequality to the Markov chain $X_0 \rightarrow X_{n-1} \rightarrow X_n$:
$$I(X_0; X_{n-1})\ge I(X_0; X_n)$$
Equivalently
$$H(X_{n-1}) - H(X_{n-1}|X_0) \ge H(X_n) - H(X_n | X_0)$$
As the chain is stationary, $X_n$ and $X_{n-1}$ have the same distribution, hence $H(X_n)=H(X_{n-1})$, and the first inequality follows.
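As a concrete sanity check, the first inequality can be verified numerically on a small chain. This is only a sketch for a hypothetical two-state transition matrix `P` started from its stationary distribution; the helper names are my own:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical transition matrix: P[i, j] = P(X_{t+1} = j | X_t = i).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Its stationary distribution (check: pi @ P == pi).
pi = np.array([2 / 3, 1 / 3])

def cond_entropy(n):
    """H(X_n | X_0) = sum_i pi_i * H(n-step distribution from state i)."""
    Pn = np.linalg.matrix_power(P, n)
    return sum(pi[i] * entropy(Pn[i]) for i in range(len(pi)))

# H(X_n | X_0) should be non-decreasing in n, as the inequality predicts.
print([round(cond_entropy(n), 4) for n in range(1, 6)])
```

The printed sequence is non-decreasing, matching $H(X_n | X_0) \geq H(X_{n-1} | X_0)$.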
For the second one, expand the mutual information around $X_0$ instead: $$H(X_0) - H(X_0|X_{n-1}) \ge H(X_0) - H(X_0|X_n),$$ which gives $H(X_0|X_n) \ge H(X_0|X_{n-1})$ directly; note that stationarity is not even needed here.
The stationarity hypothesis cannot be omitted for the first inequality:
Take a chain with two states $\{0, 1\}$: from state $0$ it stays at $0$ with probability $\frac{1}{2}$ (and moves to $1$ otherwise), while state $1$ is absorbing. Let $X_0 = 0$ with certainty, so that $H(X_n | X_0) = H(X_n)$ for every $n$. Then $X_1$ has entropy $1$ bit, but $P(X_n = 0) = 2^{-n}$ shrinks as time goes by, so $H(X_n | X_0)$ is decreasing instead of increasing.
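The decay in this counterexample is easy to compute explicitly, since $P(X_n = 0) = 2^{-n}$. A minimal check (the binary-entropy helper `h` is mine):

```python
from math import log2

def h(p):
    """Binary entropy in bits of a Bernoulli(p) variable."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

# X_0 = 0 surely, state 0 stays put with prob 1/2, state 1 is absorbing,
# so P(X_n = 0) = 2^{-n} and H(X_n | X_0) = H(X_n) = h(2^{-n}).
hs = [h(2.0 ** -n) for n in range(1, 6)]
print([round(x, 4) for x in hs])  # starts at 1.0 and strictly decreases
```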