Expected number of visits to a state of a Markov chain up to a certain time


Let $P=\{p_{ij}\}$ be a stochastic matrix (with rows and columns indexed by a countable set) and let $p^{(k)}_{ij}$ denote the entries of $P^k$. I'm trying to prove that if the associated Markov chain starts at state $j$ (i.e., $X_0=j$), and $V_n$ is the number of visits to state $j$ at times $1,\dots,n$, then $$ E[V_n]=\sum_{k=1}^np^{(k)}_{jj}. $$ This should be simple, but I'm stuck.

Best answer

Let $Y_k$ be the indicator random variable which is $1$ if $X_k = j$ and $0$ otherwise. Observe that $V_n = \sum_{k=1}^n Y_k$. Since $E[Y_k] = P(X_k = j \mid X_0 = j) = p^{(k)}_{jj}$, linearity of expectation gives $$ E[V_n] = \sum_{k=1}^n E[Y_k] = \sum_{k=1}^n p^{(k)}_{jj}. $$
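As a sanity check, here is a small numerical sketch (with a hypothetical two-state transition matrix) comparing the exact value $\sum_{k=1}^n p^{(k)}_{jj}$ against a Monte Carlo estimate of $E[V_n]$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-state transition matrix for illustration.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
j, n = 0, 10

# Exact value: E[V_n] = sum_{k=1}^n p^{(k)}_{jj}.
exact = sum(np.linalg.matrix_power(P, k)[j, j] for k in range(1, n + 1))

# Monte Carlo estimate: start at X_0 = j, count visits to j
# at times 1..n, and average over many independent runs.
def estimate_visits(num_runs=20_000):
    total = 0
    for _ in range(num_runs):
        state = j
        for _ in range(n):
            state = rng.choice(2, p=P[state])
            total += (state == j)
    return total / num_runs

estimate = estimate_visits()
print(exact, estimate)  # the two values should agree closely
```

The simulation's average visit count should match the sum of the diagonal entries of the matrix powers up to Monte Carlo error.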