Let there be a chain of $n$ bits generated by the following rules:
The first bit is $0$ or $1$ with probability $1/2$ each. From then on, each bit differs from its predecessor with probability $p$ and repeats it with probability $1-p$; that is, $P(X_{i+1} = 0 \mid X_i = 0) = P(X_{i+1} = 1 \mid X_i = 1) = 1-p$.
Find the entropy $H_n$ and the limit $\lim_{n\to\infty}\frac{1}{n} H_n$.
Let the bit stream of $n$ characters be $(X_1, X_2, \dots, X_n)$.
Recall the chain rule:
$$H_n:= H(X_1, X_2, \dots, X_n) = H(X_1) + H(X_2|X_1) + \dots + H(X_n|X_{n-1}, \dots, X_2, X_1)$$
Note that $\{X_i\}_{i=1}^{n}$ is a Markov chain, since by construction each bit depends on the past only through the bit immediately before it. Consequently, $$H(X_i \mid X_{i-1}, \dots, X_1) = H(X_i \mid X_{i-1}).$$
Lastly, it's easy to show by induction that $P(X_i = 0) = P(X_i = 1) = 1/2$ for every $i$: if $P(X_{i-1} = 0) = 1/2$, then $P(X_i = 0) = (1-p) \cdot \tfrac{1}{2} + p \cdot \tfrac{1}{2} = \tfrac{1}{2}$.
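This induction step can also be checked numerically. Below is a minimal sketch in Python (the flip probability $p = 0.3$ is an arbitrary choice for illustration) that iterates the forward recursion for the marginal:

```python
p = 0.3  # arbitrary flip probability for illustration, 0 < p < 1

prob_zero = 0.5  # P(X_1 = 0)
for i in range(2, 11):
    # P(X_i = 0) = (1 - p) * P(X_{i-1} = 0) + p * P(X_{i-1} = 1)
    prob_zero = (1 - p) * prob_zero + p * (1 - prob_zero)
    assert abs(prob_zero - 0.5) < 1e-12  # the uniform marginal is preserved
print("P(X_i = 0) =", prob_zero)  # stays at 0.5
```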
Now, for every $i \ge 2$, conditioned on either value of the previous bit, $X_i$ flips with probability $p$, so each conditional distribution is Bernoulli up to relabeling:
\begin{align} H(X_i|X_{i-1}) &= \frac{1}{2} H(X_i|X_{i-1} = 0) + \frac{1}{2} H(X_i|X_{i-1} = 1)\\ &= 2 \times \frac{1}{2}h_2(p) = h_2(p), \end{align}
where $h_2(p) := -p\log_2 p - (1-p)\log_2 (1-p)$ is the binary entropy function.
Also, $H(X_1) = 1$, since $X_1$ is uniform on $\{0,1\}$.
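The value of $H(X_i \mid X_{i-1})$ can be cross-checked against the identity $H(X_i \mid X_{i-1}) = H(X_{i-1}, X_i) - H(X_{i-1})$. A minimal sketch, again with the arbitrary choice $p = 0.3$:

```python
import math

def h2(q):
    """Binary entropy in bits."""
    return 0.0 if q in (0.0, 1.0) else -q * math.log2(q) - (1 - q) * math.log2(1 - q)

p = 0.3  # arbitrary flip probability for illustration, 0 < p < 1

# Joint distribution of (X_{i-1}, X_i): P(a, b) = P(X_{i-1} = a) * P(X_i = b | X_{i-1} = a)
joint = [0.5 * (1 - p), 0.5 * p, 0.5 * p, 0.5 * (1 - p)]

# H(X_i | X_{i-1}) = H(X_{i-1}, X_i) - H(X_{i-1}), and H(X_{i-1}) = 1 bit
joint_entropy = -sum(q * math.log2(q) for q in joint)
cond_entropy = joint_entropy - 1.0
print(cond_entropy, h2(p))  # both equal h2(p)
```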
Putting it all together,
$$H_n = H(X_1, \dots, X_n) = 1 + (n-1)h_2(p).$$ Hence the entropy rate is $$\lim_{n\to\infty}\frac{1}{n} H_n = \lim_{n\to\infty}\frac{1 + (n-1)h_2(p)}{n} = h_2(p).$$
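Finally, the closed form can be verified by brute force: compute $H_n$ directly from the exact joint distribution over all $2^n$ strings and compare. A sketch, with $p = 0.3$ and small values of $n$ chosen arbitrarily for illustration:

```python
import math
from itertools import product

def h2(q):
    """Binary entropy in bits."""
    return 0.0 if q in (0.0, 1.0) else -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def chain_prob(bits, p):
    """Exact probability of one bit string under the model."""
    prob = 0.5  # first bit is uniform on {0, 1}
    for prev, cur in zip(bits, bits[1:]):
        prob *= p if cur != prev else 1 - p
    return prob

p = 0.3  # arbitrary flip probability for illustration, 0 < p < 1
for n in range(1, 9):
    # Brute-force entropy of the joint distribution over all 2^n strings
    H = 0.0
    for bits in product((0, 1), repeat=n):
        q = chain_prob(bits, p)
        H -= q * math.log2(q)
    print(n, round(H, 6), round(1 + (n - 1) * h2(p), 6))  # the two columns agree
```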