Assume we have a two-state Markov chain, with $s_1$ and $s_0$ denoting the two states. The initial state of the Markov chain is either $s_1$ or $s_0$ with probability $p_1$ or $p_0$, respectively. The transition probability between states $s_i$ and $s_j$ is $p_{ij}$.
My question is: given the initial state probability $p_1$ for state $s_1$, what is the probability that after $L$ steps, state $s_1$ has been visited more than $N$ times? Note that remaining in state $s_1$ counts as another visit to this state.
I thought this was a fairly straightforward question, but after some research in probability and stochastic processes books I found no theorem with this result. I tried to prove it myself, but the solution looks quite tedious.
I would really appreciate it if somebody could point me in the right direction on how to calculate this probability.
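For what it's worth, here is a quick Monte Carlo sketch of the quantity I am after. The parameter names `p1`, `p12`, `p21` and the convention that "$L$ steps" means $L$ states of the chain including the initial one are my own assumptions:

```python
import random

def prob_visits_exceed(p1, p12, p21, L, N, trials=100_000, seed=0):
    """Monte Carlo estimate of P(state s1 is visited more than N times
    within L steps).  Conventions assumed: the initial state counts as a
    visit if it is s1, and remaining in s1 counts as another visit.
    p1 = P(start in s1); p12 = P(s1 -> s0); p21 = P(s0 -> s1)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        state = 1 if rng.random() < p1 else 0
        visits = 1 if state == 1 else 0
        for _ in range(L - 1):  # L states total, including the initial one
            if state == 1:
                state = 0 if rng.random() < p12 else 1
            else:
                state = 1 if rng.random() < p21 else 0
            if state == 1:
                visits += 1
        if visits > N:
            hits += 1
    return hits / trials
```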
cheers
Pol
If you are in state 1, then the number of steps before you move to state 2 follows a geometric distribution with parameter $p_{12}$. Similarly, the time before a change from state 2 to state 1 is geometric with parameter $p_{21}$.
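A quick numerical sanity check of the geometric sojourn time (a sketch; the helper `sample_sojourn` and the value `p12 = 0.3` are purely illustrative): the average time spent in state 1 before jumping to state 2 should come out near $1/p_{12}$.

```python
import random

def sample_sojourn(p12, rng):
    """Number of consecutive steps spent in state 1 before jumping to
    state 2, when each step leaves with probability p12 -- Geometric(p12)."""
    steps = 1
    while rng.random() >= p12:  # stay in state 1 with probability 1 - p12
        steps += 1
    return steps

rng = random.Random(1)
p12 = 0.3                       # illustrative transition probability
samples = [sample_sojourn(p12, rng) for _ in range(200_000)]
print(sum(samples) / len(samples))  # should be close to 1 / 0.3
```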
If you start in State 1 (assuming this counts as a visit), then to visit it more than $N$ times you need to leave this state and return at least $N$ times. Alternatively, if you start in State 2, you need to leave that state at least $N+1$ times and return to it at least $N$ times.
Now the sum of geometric distributions is a negative binomial distribution so your probability is
$$P(n>N)=p_1 \sum_{k=N}^{L-N}\mathrm{NB}(k;N,p_{21})\sum_{j=k}^{L-N}\mathrm{NB}(j;N,p_{12}) + (1-p_1) \sum_{k=N}^{L-N}\mathrm{NB}(k;N,p_{21})\sum_{j=k}^{L-(N+1)}\mathrm{NB}(j;N,p_{12})$$
where $\mathrm{NB}$ denotes the negative binomial probability mass function, evaluated at $k$ and $j$ as applicable.
Note this is the solution to the question as originally posed, where staying in state $s_1$ does not count as a visit.
For the revised question, the problem can be restated as not being in State 2 for more than $L-N$ steps. A modification of the above should get you there.
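As a sanity check for any closed form, the probability for the revised question can also be computed exactly with a small dynamic program over (current state, number of visits so far). This is a sketch under my own conventions: "$L$ steps" means $L$ states including the initial one, and both the initial state and repeated stays in $s_1$ count as visits.

```python
def prob_visits_exceed_exact(p1, p12, p21, L, N):
    """Exact P(s1 visited more than N times within L steps), by dynamic
    programming over (current state, visits so far).  Assumes L states
    including the initial one, with stays in s1 counting as visits.
    p1 = P(start in s1); p12 = P(s1 -> s0); p21 = P(s0 -> s1)."""
    # dist maps (state, visits) -> probability mass; 1 is s1, 0 is s0
    dist = {(1, 1): p1, (0, 0): 1.0 - p1}
    for _ in range(L - 1):
        nxt = {}
        for (s, v), p in dist.items():
            if s == 1:
                moves = [(0, v, p * p12), (1, v + 1, p * (1.0 - p12))]
            else:
                moves = [(1, v + 1, p * p21), (0, v, p * (1.0 - p21))]
            for s2, v2, q in moves:
                nxt[(s2, v2)] = nxt.get((s2, v2), 0.0) + q
        dist = nxt
    return sum(p for (s, v), p in dist.items() if v > N)
```

With all transition probabilities equal to $1/2$ the chain's states are i.i.d. uniform, so the visit count is Binomial$(L, 1/2)$, which gives a convenient cross-check of the routine.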