Entropy with Markov Process


Can somebody please explain this or help me reach an answer? I plainly don't know where to start, which is why I haven't included any work of my own. A step-by-step explanation would be really appreciated.

[image of the problem statement]

Best Answer

Conditional on $X_n = a$, the distribution of $X_{n+1}$ is $a$ with probability $0.7$ and $b$ with probability $0.3$. Therefore the conditional entropy (using the natural logarithm) is $$ H(X_{n+1} \mid X_n = a) = -0.7 \log(0.7) - 0.3 \log(0.3) \approx 0.611. $$ You can do a similar calculation for $H(X_{n+1} \mid X_n = b)$. Finally, by definition, $H(X_{n+1} \mid X_n)$ is the average of these conditional entropies weighted by the distribution of $X_n$: \begin{align*} H(X_{n+1} \mid X_n) &= \mathbb{P}(X_n = a)\, H(X_{n+1} \mid X_n = a) + \mathbb{P}(X_n = b)\, H(X_{n+1} \mid X_n = b) \\ &= 0.4\, H(X_{n+1} \mid X_n = a) + 0.6\, H(X_{n+1} \mid X_n = b). \end{align*}
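The calculation above can be sketched in a few lines of Python. Note that only the transition probabilities out of state $a$ and the marginal $(0.4, 0.6)$ appear in the answer; the transition probabilities out of $b$ used below ($P(b \to a) = 0.2$, $P(b \to b) = 0.8$) are an assumption, chosen so that $(0.4, 0.6)$ is a stationary distribution.

```python
import math

# Transition probabilities. Row "a" is given in the answer;
# row "b" is an ASSUMPTION chosen so that pi = (0.4, 0.6)
# satisfies pi P = pi.
P = {"a": {"a": 0.7, "b": 0.3},
     "b": {"a": 0.2, "b": 0.8}}
pi = {"a": 0.4, "b": 0.6}

def entropy(dist):
    """Shannon entropy in nats (natural log, matching 0.611 above)."""
    return -sum(p * math.log(p) for p in dist.values() if p > 0)

# H(X_{n+1} | X_n = s) for each state s, then average over pi.
H_given = {s: entropy(P[s]) for s in P}
H_cond = sum(pi[s] * H_given[s] for s in pi)

print(H_given["a"])  # ~0.611, as in the answer
print(H_cond)
```

With these assumed numbers, $H(X_{n+1} \mid X_n = b) \approx 0.500$ and the weighted average comes out to about $0.545$ nats; with different probabilities out of $b$, only the second row and the final average change.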