Are the conditional entropies of a stationary stochastic process a decreasing sequence?


On p. 75 of Elements of Information Theory there is a proof of the following theorem:

For a stationary stochastic process, $H(X_n|X_{n-1}, \ldots, X_1)$ is nonincreasing in $n$ and has a limit $H'(\mathcal{X})$.

In the proof, the following is stated:

"Since $H(X_n|X_{n-1}, \ldots, X_1)$ is a decreasing sequence of nonnegative numbers,..."

Where does this fact come from? And why is the sequence called decreasing? Isn't it merely non-increasing?

Best answer:

The key point is that conditioning cannot increase entropy: $H(X|Y) \leq H(X)$, and more generally $H(X|Y,Z) \leq H(X|Z)$. (When you have access to more known information, the uncertainty cannot increase.) Then you use stationarity to shift in time:

$H(X_n | X_{n-1}, \ldots, X_1) \leq H(X_n | X_{n-1}, \ldots, X_2) = H(X_{n-1} | X_{n-2}, \ldots, X_1)$
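The inequality here is the "conditioning reduces entropy" fact, which follows from the nonnegativity of conditional mutual information:

$$H(X_n \mid X_{n-1}, \ldots, X_1) = H(X_n \mid X_{n-1}, \ldots, X_2) - I(X_n ; X_1 \mid X_{n-1}, \ldots, X_2) \leq H(X_n \mid X_{n-1}, \ldots, X_2),$$

and the equality is stationarity: shifting every index down by one leaves the joint distribution, and hence the conditional entropy, unchanged.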

(Yes, it is non-increasing. People are often vague with this language, calling $\leq$ and $<$ decreasing, unfortunately.)
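As a sanity check, here is a small numerical sketch (my own toy example, not from the book). A binary observation $Y_n = f(X_n)$ of a hidden stationary Markov chain $X_n$ is itself stationary (though generally not Markov), so by the theorem its conditional entropies $H(Y_n|Y_{n-1},\ldots,Y_1)$ must form a nonincreasing sequence. We can verify this by brute-force enumeration of the joint distribution:

```python
import itertools
import math

# Toy example: a hidden 3-state stationary Markov chain, observed through
# a deterministic binary function f. The observed process is stationary.
P = [[0.6, 0.2, 0.2],
     [0.2, 0.6, 0.2],
     [0.2, 0.2, 0.6]]          # doubly stochastic transition matrix
pi = [1/3, 1/3, 1/3]           # uniform distribution is stationary for P
f = [0, 0, 1]                  # observation Y_n = f(X_n)

def joint_prob(ys):
    """P(Y_1 = ys[0], ..., Y_n = ys[-1]), summing over hidden state paths."""
    total = 0.0
    for path in itertools.product(range(3), repeat=len(ys)):
        if any(f[s] != y for s, y in zip(path, ys)):
            continue
        p = pi[path[0]]
        for a, b in zip(path, path[1:]):
            p *= P[a][b]
        total += p
    return total

def block_entropy(n):
    """H(Y_1, ..., Y_n) in bits."""
    if n == 0:
        return 0.0
    h = 0.0
    for ys in itertools.product([0, 1], repeat=n):
        p = joint_prob(ys)
        if p > 0:
            h -= p * math.log2(p)
    return h

# Chain rule: H(Y_n | Y_{n-1}, ..., Y_1) = H(Y_1..Y_n) - H(Y_1..Y_{n-1})
cond = [block_entropy(n) - block_entropy(n - 1) for n in range(1, 7)]
print([round(h, 4) for h in cond])
assert all(a >= b - 1e-12 for a, b in zip(cond, cond[1:]))  # nonincreasing
```

The sequence starts at $H(Y_1)$ and decreases toward the entropy rate of the observed process, exactly as the theorem predicts.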