How to show average number of changes in a two-states Markov Chain?


Assume we have two states, $0$ and $1$. The probability of staying in the same state is $\frac{3}{4}$, and the probability of moving to the other state is $\frac{1}{4}$.

Equivalently, written as a transition matrix $P$:

\begin{align*} P = \begin{bmatrix} \frac{3}{4} & \frac{1}{4} \\ \frac{1}{4} & \frac{3}{4} \end{bmatrix} \end{align*}
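As a quick sanity check (my addition, not part of the question), one can simulate this chain and observe that the empirical fraction of state changes settles near $\frac{1}{4}$. A minimal sketch using only the standard library:

```python
import random

def fraction_of_changes(n, p_stay=0.75, seed=0):
    """Run n steps of the two-state chain and return (number of changes) / n."""
    rng = random.Random(seed)
    state = rng.randint(0, 1)       # uniform initial state (the 1/2 factor in p)
    changes = 0
    for _ in range(n):
        if rng.random() < p_stay:   # stay with probability 3/4
            continue
        state = 1 - state           # switch with probability 1/4
        changes += 1
    return changes / n

print(fraction_of_changes(100_000))  # close to 0.25
```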

We're also given $p(x_0,\ldots,x_n)$:

\begin{align*} p(x_0,\ldots,x_n) = \frac{1}{2} \Big(\frac{1}{4}\Big)^w \Big(\frac{3}{4} \Big)^{n-w} \end{align*}

Where $w$ is the number of changes ($0 \rightarrow 1$ or $1 \rightarrow 0$) in $x_0,\ldots,x_n$.
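As a concrete check of this formula (my addition): for a short sequence, evaluating $\frac{1}{2}\big(\frac{1}{4}\big)^w\big(\frac{3}{4}\big)^{n-w}$ agrees with multiplying the transition probabilities step by step.

```python
from fractions import Fraction

def p_formula(seq):
    """p(x_0,...,x_n) = (1/2) * (1/4)^w * (3/4)^(n-w), w = number of changes."""
    n = len(seq) - 1
    w = sum(a != b for a, b in zip(seq, seq[1:]))
    return Fraction(1, 2) * Fraction(1, 4)**w * Fraction(3, 4)**(n - w)

def p_stepwise(seq):
    """Initial probability 1/2, then 3/4 per stay and 1/4 per switch."""
    p = Fraction(1, 2)
    for a, b in zip(seq, seq[1:]):
        p *= Fraction(3, 4) if a == b else Fraction(1, 4)
    return p

seq = [0, 0, 1, 1, 0]         # n = 4, w = 2
print(p_formula(seq))         # (1/2) * (1/4)^2 * (3/4)^2 = 9/512
print(p_formula(seq) == p_stepwise(seq))
```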

Using the definition of the asymptotic equipartition property (AEP):

\begin{align*} A_\epsilon^{(n+1)} := \Big\{ (x_0,\ldots,x_n) : \Big| - \frac{1}{n} \log_2 p(x_0,\ldots,x_n) - H(X) \Big| < \epsilon \Big\} \end{align*}

Where

\begin{align*} H(X) = - \frac{1}{4} \log_2 \frac{1}{4} - \frac{3}{4} \log_2 \frac{3}{4} \end{align*}

is the entropy rate of the chain (the per-step conditional entropy).

I should be able to show that $\frac{w}{n} \in \Big(\frac{1}{4} - \epsilon', \frac{1}{4} + \epsilon' \Big)$, where $\frac{w}{n}$ is the fraction of changes between the states $0$ and $1$. Unfortunately, I am failing to do so. Could anybody show me how to get to this result?
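For what it's worth, here is a sketch of one way the algebra can go (my working, not from a reference; the constant $\log_2 3$ falls out of the expansion). Taking logarithms in the formula for $p(x_0,\ldots,x_n)$:

\begin{align*}
-\frac{1}{n}\log_2 p(x_0,\ldots,x_n)
&= \frac{1}{n} + \frac{2w}{n} + \Big(1 - \frac{w}{n}\Big)\log_2\frac{4}{3} \\
&= H(X) + \frac{1}{n} + \Big(\frac{w}{n} - \frac{1}{4}\Big)\Big(2 - \log_2\frac{4}{3}\Big) \\
&= H(X) + \frac{1}{n} + \Big(\frac{w}{n} - \frac{1}{4}\Big)\log_2 3,
\end{align*}

so the typical-set condition $\big| -\frac{1}{n}\log_2 p - H(X) \big| < \epsilon$ becomes $\big| \frac{1}{n} + \big(\frac{w}{n} - \frac{1}{4}\big)\log_2 3 \big| < \epsilon$, which bounds $\frac{w}{n}$ in an interval around $\frac{1}{4}$ with, e.g., $\epsilon' = \big(\epsilon + \frac{1}{n}\big)/\log_2 3$.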