Amount of information a hidden state can convey (HMM)


In this paper (Products of Hidden Markov Models, http://www.cs.toronto.edu/~hinton/absps/aistats_2001.pdf), the authors say that:

The hidden state of a single HMM can only convey log K bits of information about the recent history.

Here $K$ is the number of hidden states. I was trying to understand what they mean and came up with the following: imagine I have $K = 4$ states: <0-0>, <0-1>, <1-0>, <1-1>. These 4 states convey $\log_2(4) = 2$ bits of information, call them A and B, each of which can take the value 0 or 1 (e.g. the state with A=1 and B=0 is <1-0>).

However, I'm not sure if this is the correct reasoning.
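The reasoning in the question can be sketched in a few lines of Python (the state labels and bit names A, B are the question's own example, not anything from the paper):

```python
from itertools import product
from math import log2

# The question's example: K = 4 hidden states can each be labeled
# by log2(4) = 2 independent bits, A and B.
K = 4
bits_per_state = log2(K)  # 2.0

# Enumerate every 2-bit label <A-B>; there is exactly one per state.
states = [f"<{a}-{b}>" for a, b in product([0, 1], repeat=2)]

print(bits_per_state)  # 2.0
print(states)          # ['<0-0>', '<0-1>', '<1-0>', '<1-1>']
```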


Best answer:

With $n$ bits you can encode up to $2^n$ distinct states (that is the total amount of information $n$ bits can carry).
Conversely, to encode $K$ states you need at least $\log_2(K)$ bits, i.e. $\lceil \log_2 K \rceil$ whole bits.
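A minimal sketch of this counting argument (the helper name `bits_needed` is just for illustration):

```python
from math import ceil, log2

def bits_needed(K: int) -> int:
    # Minimum whole number of bits that gives each of K states
    # a distinct binary code: the smallest n with 2**n >= K.
    return ceil(log2(K))

# n bits distinguish 2**n states, so K states need at least log2(K) bits.
for K in [2, 4, 5, 8, 100]:
    print(K, bits_needed(K))
# e.g. K = 5 states need 3 bits, since 2**2 = 4 < 5 <= 2**3 = 8
```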