Prove or disprove: Let $\{X_n\}_n$ be a homogeneous Markov chain. Suppose that, starting from state $i$, there is a positive probability of visiting state $j$ at least $3$ times. Does it follow that, starting from state $i$, there is also a positive probability of visiting state $j$ at least $5$ times?
The claim seems true, but I can't justify it formally. How can I prove it?
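As a sanity check (not a proof), here is a quick simulation on a toy three-state chain of my own invention: state $0$ plays the role of $i$, state $1$ the role of $j$, and state $2$ is absorbing. Both events ("at least 3 visits to $j$" and "at least 5 visits to $j$") come out with strictly positive empirical probability, matching the intuition that a positive-probability $j \to j$ return loop can be repeated.

```python
import random

random.seed(0)

# Toy 3-state chain (my own example, not from the problem statement):
# state 0 = i, state 1 = j, state 2 = absorbing "graveyard" state.
# Exact probabilities: P(>=3 visits) = 0.5 * 0.3**2 = 0.045,
#                      P(>=5 visits) = 0.5 * 0.3**4 = 0.00405.
P = {
    0: [(1, 0.5), (2, 0.5)],
    1: [(1, 0.3), (2, 0.7)],
    2: [(2, 1.0)],
}

def step(state):
    """Sample the next state from the transition row of `state`."""
    r, acc = random.random(), 0.0
    for nxt, p in P[state]:
        acc += p
        if r < acc:
            return nxt
    return P[state][-1][0]  # guard against float rounding

def visits_to_j(start=0, j=1, max_steps=200):
    """Count visits to state j before absorption (or step cap)."""
    s, count = start, 0
    for _ in range(max_steps):
        s = step(s)
        if s == j:
            count += 1
        if s == 2:  # absorbed: no further visits possible
            break
    return count

N = 100_000
counts = [visits_to_j() for _ in range(N)]
p3 = sum(c >= 3 for c in counts) / N
p5 = sum(c >= 5 for c in counts) / N
print(p3, p5)  # both strictly positive
```

The heuristic the simulation illustrates: if $j$ can be visited $3$ times with positive probability, then in particular the chain can return from $j$ to $j$ with positive probability, and by homogeneity that return loop can be traversed two more times.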