$P(T_0 <\infty) = 1 - \lim_{k \to \infty} P(T_0 = k)$?


I am struggling to understand how to handle the expression $P(T_0 < \infty)$, where $T_0 := \min\{ n \geq 1: X_n = 0 \}$ for some Markov chain. I naively assumed that $\displaystyle P(T_0 < \infty) = 1 - P(T_0 = \infty) = 1 - \lim_{k \to \infty} P(T_0 = k)$. Is this true in general, or at least when the $(X_n)$ are independent?


Accepted answer:

No, it's not true. For instance, suppose $P(T_0 = k) = \frac{1}{2^{k+1}}$ for $k = 1, 2, 3, \dots$, and $P(T_0 = \infty) = 1/2$. Then $\lim_{k \to \infty} P(T_0 = k) = 0$, so your formula would give $P(T_0 < \infty) = 1$, whereas in fact $P(T_0 < \infty) = \sum_{k \geq 1} 2^{-(k+1)} = 1/2$.
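A quick numerical sketch of this counterexample (the distribution here is the hypothetical one above, not tied to any particular Markov chain): summing the point masses shows $P(T_0 < \infty) = 1/2$, while the individual probabilities $P(T_0 = k)$ tend to $0$.

```python
# Counterexample distribution: P(T_0 = k) = 1/2^(k+1) for k >= 1,
# with the remaining mass P(T_0 = infinity) = 1/2.
p = [0.5 ** (k + 1) for k in range(1, 60)]

prob_finite = sum(p)   # approximates P(T_0 < infinity), a geometric sum equal to 1/2
limit_term = p[-1]     # P(T_0 = k) for large k, tending to 0

print(prob_finite)     # ~ 0.5
print(limit_term)      # ~ 0
# The naive formula 1 - lim P(T_0 = k) would give 1 - 0 = 1, not 1/2.
```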

But what you can say is that $P(T_0 < \infty) = \lim_{k \to \infty} P(T_0 < k)$. This is the "continuity from below" property of a countably additive measure: $T_0(\omega)$ is finite if and only if there exists an integer $k$ with $T_0(\omega) < k$, that is, $\{T_0 < \infty\} = \bigcup_{k=1}^\infty \{T_0 < k\}$. Moreover, if $T_0 < k$, then certainly $T_0 < k+1$, so $\{T_0 < 1\} \subset \{T_0 < 2\} \subset \dots$ and the events are increasing.
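Continuity from below can be seen numerically with the same hypothetical distribution as above: the values $P(T_0 < k)$ increase in $k$ and approach $P(T_0 < \infty) = 1/2$.

```python
# P(T_0 < k) = sum of 1/2^(j+1) for j = 1, ..., k-1,
# using the counterexample distribution P(T_0 = j) = 1/2^(j+1).
def p_less_than(k):
    return sum(0.5 ** (j + 1) for j in range(1, k))

values = [p_less_than(k) for k in (2, 5, 10, 40)]
print(values)  # increasing, approaching 0.5 = P(T_0 < infinity)
```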

By similar arguments, you can also say $$P(T_0 < \infty) = \lim_{k \to \infty} P(T_0 \le k) = 1 - \lim_{k \to \infty} P(T_0 > k) = 1 - \lim_{k \to \infty} P(T_0 \ge k).$$
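For the tail version, the complement events $\{T_0 > k\}$ decrease to $\{T_0 = \infty\}$; with the hypothetical counterexample distribution, $P(T_0 > k)$ has the closed form $1/2 + 2^{-(k+1)}$, so $1 - P(T_0 > k)$ increases to $1/2$:

```python
# P(T_0 > k) = P(T_0 = infinity) + sum of 1/2^(j+1) for j > k
#            = 1/2 + 1/2^(k+1)   (geometric tail sum, closed form)
def p_gt(k):
    return 0.5 + 0.5 ** (k + 1)

tail_limit = [1 - p_gt(k) for k in (1, 5, 20, 50)]
print(tail_limit)  # increasing toward 0.5 = P(T_0 < infinity)
```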