Assume we have a 2-state Markov chain with the transition matrix $$ \left[ \begin{array}{cc} p & 1-p\\ 1-q & q \end{array} \right] $$ and assume that the first state is the starting state. What is the expected number of state transitions in $T$ periods? (I want to count the number of state changes from state 1 to state 2 and the other way round.)
How can I calculate the expected number of changes of state of a discrete-time Markov chain?
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There are 2 best solutions below
Denote by $n_{i,T}$, $i=1,2$, the expected number of transitions in $T$ periods when starting from state $i$. You can set up the recursions $$ n_{1,T} = p\, n_{1,T-1}+(1-p)\bigl(1+n_{2,T-1}\bigr),\qquad n_{2,T} = q\, n_{2,T-1}+(1-q)\bigl(1+n_{1,T-1}\bigr) $$ with the obvious initial conditions $n_{1,1}=1-p$ and $n_{2,1}=1-q$.
In matrix notation, this becomes $$ \mathbf{n}_T = \mathbf{1}+\mathbf{M}\mathbf{n}_{T-1}, $$ where $\mathbf{n}_T=(n_{1,T},n_{2,T})^T$, $\mathbf{1}=(1-p,1-q)^T$ (despite the notation, not the all-ones vector), and $\mathbf{M}$ is your transition matrix.
Can you solve this recursion?
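As a numerical sketch, the recursion above can be iterated directly. The values `p = 0.7`, `q = 0.6` below are illustrative assumptions, not taken from the question:

```python
import numpy as np

# Iterate the recursion n_T = b + M n_{T-1} with n_1 = b,
# where b = (1-p, 1-q)^T and M is the transition matrix.
# p = 0.7, q = 0.6 are illustrative values, not from the question.
p, q = 0.7, 0.6
M = np.array([[p, 1 - p],
              [1 - q, q]])
b = np.array([1 - p, 1 - q])

def expected_transitions(T):
    """Expected number of state changes in T periods; entry i is for start state i+1."""
    n = b.copy()                 # n_1 = (1-p, 1-q)
    for _ in range(T - 1):
        n = b + M @ n            # the recursion in matrix form
    return n

print(expected_transitions(10))
```

For large $T$ both entries grow linearly at the common rate $r=2(1-p)(1-q)/(2-p-q)$ per period, in agreement with the other answer.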
1. Let $s=(2-p-q)^{-1}$; then $\pi_0=(1-q)s$ and $\pi_1=(1-p)s$ define a stationary distribution $\pi$. If the initial distribution is $\pi$, the distribution at each step is $\pi$, hence the probability that a jump occurs at any given step is $$ r=(1-p)\pi_0+(1-q)\pi_1=2(1-p)(1-q)s. $$ In particular, the mean number of jumps during $T$ periods is exactly $rT$.
2. By a coupling argument, for any distribution $\nu$, the difference between the number of jumps of the Markov chain with initial distribution $\nu$ and the number of jumps of the Markov chain with initial distribution $\pi$ is exponentially integrable.
3. Hence, for any starting distribution, the mean number of jumps during $T$ periods is $rT+O(1)$.
4. Note that $$ \frac1r=\frac12\left(\frac1{1-p}+\frac1{1-q}\right), $$ and that $\frac1{1-p}$ and $\frac1{1-q}$ are the mean lengths of the intervals during which the chain stays at $0$ and $1$, respectively, before a transition to the other state occurs. Hence the formula. The same renewal argument, applied to a Markov chain with $n$ states and transition matrix $Q$ whose stationary distribution satisfies $\pi_k\propto\frac1{1-Q_{kk}}$ (as it does in the two-state case), yields $$ \frac1r=\frac1n\sum_{k=1}^n\frac1{1-Q_{kk}}. $$
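The $rT+O(1)$ claim is easy to check by simulation. A minimal Monte Carlo sketch, again with illustrative values `p = 0.7`, `q = 0.6` that are not from the question:

```python
import random

# Monte Carlo check that the mean number of jumps in T periods is rT + O(1).
# p = 0.7, q = 0.6 are illustrative assumptions, not from the question.
p, q = 0.7, 0.6
stay = {0: p, 1: q}                      # probability of staying put in each state
r = 2 * (1 - p) * (1 - q) / (2 - p - q)  # per-period jump probability under pi

def mean_jumps(T, start=0, trials=20000, seed=1):
    """Average number of state changes over `trials` simulated paths of length T."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        state, jumps = start, 0
        for _ in range(T):
            if rng.random() >= stay[state]:  # the chain leaves its current state
                state = 1 - state
                jumps += 1
        total += jumps
    return total / trials

T = 200
print(mean_jumps(T), r * T)  # the two numbers differ only by an O(1) term
```

Starting from state $0$ rather than from $\pi$ shifts the mean by a bounded constant, which is exactly the $O(1)$ correction in item 3.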