Find the entropy of a Markov chain


For a regular finite Markov chain with transition matrix $$P = \begin{pmatrix} 0.5 & 0.5 & 0 \\ 0.5 & 0.25 & 0.25 \\ 0 & 0.5 & 0.5 \end{pmatrix}$$ the entropy is

$$H(P) = \sum^n_{i=1}p_iH_i$$ Where $p_i$ is the equilibrium probability of state $S_i$ and $H_i$ is the entropy of the $i$-th row of $P$.

Find the entropy for the model.

I have a few questions here.

First off, I thought that we get the equilibrium probability vector $v^\text{T}$ from $$v^\text{T}P = v^\text{T}$$ Don't we get the same vector, $v^\text{T} = (0.4, 0.4, 0.2)$, for every $S_i$?
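As a sanity check on that vector, here is a small NumPy sketch (not from the original post) that solves $v^\text{T}P = v^\text{T}$ together with the normalisation $\sum_i v_i = 1$ by least squares:

```python
import numpy as np

# Transition matrix from the question.
P = np.array([[0.5, 0.5,  0.0],
              [0.5, 0.25, 0.25],
              [0.0, 0.5,  0.5]])

# v^T P = v^T  is equivalent to  (P^T - I) v = 0.
# Stack the normalisation constraint sum(v) = 1 as an extra row
# and solve the (consistent) overdetermined system by least squares.
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
v, *_ = np.linalg.lstsq(A, b, rcond=None)

print(v)  # approximately [0.4, 0.4, 0.2]
```

So the equilibrium vector is indeed $(0.4, 0.4, 0.2)$, and it is a single vector for the whole chain, not one per state.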

I would then calculate $H_i = -\sum^n_{j=1}p_{ij}\log_2 p_{ij}$, where $p_{ij}$ is the $j$-th entry of the $i$-th row of $P$ (taking $0\log_2 0 = 0$), and then just sum up the answers I got: $$\begin{align*} H_1 &= -(0.5\log_2 0.5 + 0.5\log_2 0.5) = 1 \\ H_2 &= -(0.5\log_2 0.5 + 0.25\log_2 0.25 + 0.25\log_2 0.25) = 1.5 \\ H_3 &= -(0.5\log_2 0.5 + 0.5\log_2 0.5) = 1 \end{align*}$$ So I would get $H(P) = (0.4, 0.4, 0.2)\cdot3.5$

I'm quite sure this answer is completely wrong. Any help or pointers are appreciated, thanks in advance.

Best answer

Indeed, the overall vector $v$ is the same no matter which state $S_i$ you're looking at. The point is that the $i$th entry of $v$ gives you the equilibrium probability of landing in the $i$th state, which is to say $p_i$.

So, you're almost there: the mistake was summing the $H_i$ first and then multiplying by the whole vector. Instead, from $v^T = (0.4,0.4,0.2)$ we take $p_1 = 0.4, p_2 = 0.4, p_3 = 0.2$ and form the weighted sum $$ H(P) = \sum_i p_iH_i = 0.4 \cdot 1 + 0.4 \cdot 1.5 + 0.2 \cdot 1 = 1.2 \text{ bits}. $$
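The whole computation can be checked in a few lines of NumPy (a sketch, not part of the original answer), computing each row entropy $H_i = -\sum_j p_{ij}\log_2 p_{ij}$ and then the weighted sum:

```python
import numpy as np

# Transition matrix and equilibrium distribution from the question.
P = np.array([[0.5, 0.5,  0.0],
              [0.5, 0.25, 0.25],
              [0.0, 0.5,  0.5]])
v = np.array([0.4, 0.4, 0.2])

def row_entropy(row):
    """Shannon entropy of one row in bits, with the convention 0 log 0 = 0."""
    nz = row[row > 0]
    return -np.sum(nz * np.log2(nz))

H = np.array([row_entropy(row) for row in P])
print(H)      # row entropies: 1, 1.5, 1
print(v @ H)  # H(P) = 1.2
```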