Calculating the information per symbol of a Markov chain source

I have a 4-state 2nd-order Markov chain source with symbols 0 and 1.

I have all the transition probabilities and have worked out the probabilities of each state.

How do I go about finding the amount of information of a bigram originating from this source? Would I be correct in thinking it would be the sum of $-(\text{transition probability}) \times \log_2(\text{transition probability})$ over all transitions? And would the information per symbol then be this value divided by 2?

Thanks.

What you need to find is the entropy rate of the Markov chain.

Let $X_t$ denote the bigram ($00$, $01$, $10$ or $11$) at time $t$. Assuming the Markov chain is stationary, the entropy rate $\mathcal{H}$ is given by
$$
\begin{align*}
\mathcal{H} &= H\left(X_t \mid X_{t-1}\right) \\
&= \sum_{x} \mathbb{P}\left(X_{t-1}=x\right) H\left(X_t \mid X_{t-1}=x\right) \\
&= \sum_{x} \mu(x)\left(-\sum_{x'} P(x',x)\log P(x',x)\right) \\
&= -\sum_{x,x'}\mu(x)\, P(x',x)\log P(x',x),
\end{align*}
$$
where $\mu(x)$ is the stationary distribution and $P(x',x)$ is the transition probability from state $x$ to state $x'$.

The information per symbol is then, as you suggested, the entropy rate divided by 2, since each transition of this chain emits one two-symbol bigram.
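As a concrete sketch, the computation above can be carried out numerically. The transition matrix below is made up purely for illustration (the question does not give the actual probabilities), and the division by 2 follows the bigram-per-transition convention of this answer; the stationary distribution $\mu$ is obtained as the left eigenvector of the transition matrix with eigenvalue 1.

```python
import numpy as np

# Hypothetical 4x4 transition matrix over the bigram states 00, 01, 10, 11.
# Row i, column j = P(next state j | current state i); rows sum to 1.
# These numbers are invented for illustration -- substitute your own.
P = np.array([
    [0.7, 0.3, 0.0, 0.0],
    [0.0, 0.0, 0.4, 0.6],
    [0.5, 0.5, 0.0, 0.0],
    [0.0, 0.0, 0.2, 0.8],
])

# Stationary distribution mu: left eigenvector of P with eigenvalue 1,
# i.e. mu @ P = mu, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
mu = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
mu = mu / mu.sum()

# Entropy rate: H = -sum_{x,x'} mu(x) P(x -> x') log2 P(x -> x').
# Zero-probability transitions contribute 0, so replace 0 with 1 before
# taking the log (0 * log2(1) = 0) to avoid log(0) warnings.
safe_P = np.where(P > 0, P, 1.0)
entropy_rate = -np.sum(mu[:, None] * P * np.log2(safe_P))

print(entropy_rate)      # bits per transition (per bigram, in this convention)
print(entropy_rate / 2)  # bits per symbol, as in the answer above
```

Using `log2` gives the answer in bits; replacing it with `np.log` would give nats instead.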