Markov chains: interarrival time and hitting times


Suppose that $(X_{n})_{n\in\mathbb{N}}$ is a Markov Chain with state space $S$ and $A\subseteq S$ and define:

$$ T_{A} := \min\{n \geq 0 : X_{n} \in A\} $$

as the first hitting time of the set $A$.

How can we prove that $T_{A}$ is a well-defined random variable on $\Omega$, the sample space on which each $X_{n}$ is defined? Given that, and given $k_0 \in \mathbb{N}^{+}$, how can we formally prove that

$$ \{T_{A} = k_{0}\} = \{X_{0} \notin A, X_{1} \notin A, \dots, X_{k_0-1} \notin A, X_{k_0} \in A\}? $$

I understand that $T_A(\omega)$ (when it is defined) is a number for each fixed outcome $\omega$, while the previous line is an equality of events (namely, $\{T_A = k_0\}$), but I am not sure how to proceed. Normally we would take an element from one set, verify it belongs to the other set, and then argue the other way around, but I am completely clueless about how to express these elements in terms of the chain and the hitting time.
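One way to make the element-by-element argument concrete (a sketch, spelling out the outcome $\omega$ explicitly) is to note that both events are statements about the same realized path $(X_0(\omega), X_1(\omega), \dots)$:

```latex
\begin{align*}
\omega \in \{T_A = k_0\}
&\iff \min\{n \geq 0 : X_n(\omega) \in A\} = k_0 \\
&\iff X_{k_0}(\omega) \in A \ \text{ and } \ X_n(\omega) \notin A \text{ for all } 0 \leq n < k_0 \\
&\iff \omega \in \{X_0 \notin A,\, \dots,\, X_{k_0-1} \notin A,\, X_{k_0} \in A\}.
\end{align*}
```

The middle equivalence is just the definition of the minimum of a set of natural numbers: it equals $k_0$ exactly when $k_0$ belongs to the set and no smaller index does.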

On the other hand, if for $n \in \mathbb{N}^{+}$ we define $N_{n}(A) := \sum_{k=1}^{n} 1_{A}(X_{k})$, then the hitting time of the $r$-th visit is

$$ T_{A}^{r} := \min\{n \geq 1 : N_{n}(A) = r\}. $$

Is there any way to express the event $\{T_{A}^{r} = k_{0}\}$ in a similar way to the first hitting time? In the end I'd like to formally prove that the interarrival times (defined as differences of consecutive visit times $T_A^{r} - T_A^{r-1}$) are i.i.d. random variables.
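To see what these definitions say pointwise, here is a small sketch that evaluates $T_A$ and $T_A^r$ on a single realized path and checks the event decomposition on that outcome. The toy path and the choice $A = \{1\}$ are hypothetical, purely for illustration:

```python
def first_hitting_time(path, A):
    """T_A(omega) = min{n >= 0 : X_n(omega) in A}, evaluated on one
    realized path (X_0(omega), X_1(omega), ...); None plays the role
    of +infinity when the path never enters A."""
    for n, x in enumerate(path):
        if x in A:
            return n
    return None

def rth_visit_time(path, A, r):
    """T_A^r(omega) = min{n >= 1 : N_n(A) = r}, where
    N_n(A) = sum_{k=1}^n 1_A(X_k) counts visits to A up to time n."""
    count = 0
    for n in range(1, len(path)):
        if path[n] in A:
            count += 1
            if count == r:
                return n
    return None

# Hypothetical realized path on S = {0, 1}, with target set A = {1}.
path, A = [0, 0, 1, 0, 1, 1, 0], {1}

k0 = first_hitting_time(path, A)  # here k0 = 2
# {T_A = k0} holds at this omega iff X_0,...,X_{k0-1} are outside A
# and X_{k0} is in A -- the event identity, checked pointwise:
assert all(path[n] not in A for n in range(k0)) and path[k0] in A

# Visit times and the interarrival gaps between consecutive visits:
visits = [rth_visit_time(path, A, r) for r in (1, 2, 3)]  # [2, 4, 5]
gaps = [b - a for a, b in zip(visits, visits[1:])]        # [2, 1]
```

The same pattern suggests the analogous decomposition for the $r$-th visit: $\{T_A^r = k_0\}$ is the event that $X_{k_0} \in A$ and the path has accumulated exactly $r-1$ visits to $A$ strictly before time $k_0$.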