Let $X_t$ be a continuous-time (irreducible) Markov chain on the finite state space $\{1,\dots,n\}$. Fix $T>0$. Is it possible for $X_t$ to transition an infinite number of times on $[0,T]$? My guess is "yes, but only with probability zero."
The reason is that for their infinite sum to converge, the holding times (the times spent in each successive state) must decrease to zero. But each holding time is exponentially distributed with positive mean, so this event seems increasingly unlikely, with probability tending to zero. Can anyone verify/prove or disprove this claim? Thanks.
Fix a state $x$, and let $A_x$ be the event that there are infinitely many visits to $x$ in finite time. Let $T_n$ be the time of the $n$th visit to $x$. Since the chain is finite and irreducible, all the $T_n$ are a.s. finite, and by the strong Markov property the return times $R_n = T_n - T_{n-1}$ are i.i.d. positive random variables. Now for $\omega \in A_x$, the limit of $T_n(\omega)$ is finite, which is to say $\sum_{n=1}^\infty R_n(\omega) < \infty$. But a sum of nontrivial i.i.d. positive random variables must diverge almost surely: pick $\varepsilon > 0$ with $p = P(R_1 > \varepsilon) > 0$; the events $\{R_n > \varepsilon\}$ are independent and $\sum_n P(R_n > \varepsilon) = \infty$, so by the second Borel–Cantelli lemma $R_n > \varepsilon$ for infinitely many $n$ almost surely, forcing the sum to diverge. So $P(A_x) = 0$.
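As a numerical sanity check (purely illustrative: the exponential distribution and its mean below are made-up stand-ins for the return times $R_n$), the strong law makes $T_n$ grow linearly in $n$, so it cannot stay bounded:

```python
import random

random.seed(0)

# Illustrative i.i.d. positive "return times" R_n; exponential with mean 2
# is an arbitrary choice, not derived from any particular chain.
mean = 2.0
n = 100_000

total = 0.0  # T_n = R_1 + ... + R_n
for _ in range(n):
    total += random.expovariate(1.0 / mean)  # R ~ Exp(rate = 1/mean)

# By the SLLN, T_n / n -> E[R_1] = mean a.s., so T_n -> infinity.
print(total / n)
```

The printed sample mean sits close to $E[R_1] = 2$, consistent with $T_n \approx 2n \to \infty$.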
Since this holds for every state $x$, of which there are finitely many, we have $P(\bigcup_x A_x) = 0$. That is to say, almost surely no state is visited infinitely often in finite time. If the chain were to transition infinitely often in finite time, then by pigeonhole some state would be visited infinitely often, and we just showed this happens with probability zero.
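For what it's worth, here is a toy simulation (the 3-state chain, its rates, and the window $T=10$ are invented for illustration) counting transitions on a fixed time window; every run makes only finitely many jumps:

```python
import random

random.seed(1)

# A small irreducible chain on {0, 1, 2}: exponential holding time in each
# state, then a uniform jump to one of the other two states.
rates = [1.0, 2.0, 3.0]  # holding-time rate in each state (illustrative)
T = 10.0

def count_transitions():
    """Run the chain on [0, T] and return the number of jumps made."""
    t, state, jumps = 0.0, 0, 0
    while True:
        t += random.expovariate(rates[state])  # holding time in current state
        if t > T:
            return jumps
        state = random.choice([s for s in range(3) if s != state])
        jumps += 1

counts = [count_transitions() for _ in range(1000)]
print(max(counts))  # finite (and modest) in every run
```

The maximum over 1000 runs stays on the order of a few dozen, as expected since the mean jump rate here is at most 3 per unit time.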