Write probability of first return time in terms of first hitting time


Consider a time-homogeneous Markov chain $(X_n)_{n\ge 0}$ with state space $I$ and no self loops. Given $X_0 = i \in I$, define the first return time $T_i = \inf\{n\ge 1 : X_n = i\}$ and the first hitting time $H_i = \inf\{n\ge 0 : X_n = i\}$. I want to see whether the following equalities hold:
$$ P(T_i < \infty | X_0 = i) = \sum_{j \in I} P(T_i < \infty | X_1 = j , X_0 = i)P(X_1=j|X_0=i) $$ $$ = \sum_{j \in I} P(T_i < \infty | X_1 = j)P(X_1=j|X_0=i) $$ $$ = \sum_{j \in I} P(H_i < \infty |X_0 = j)P(X_1=j|X_0=i) $$ The first equality seems to hold because, on the event $\{X_0=i\}$, the event $\{T_i<\infty\}$ is the disjoint union $\bigcup_{j\in I} \{T_i<\infty\}\cap\{X_1=j\}$.

The second equality seems to hold because of the Markov property.

The third equality seems to hold because of time homogeneity and, seemingly, $P(H_i<\infty|X_0=j\neq i) = P(T_i<\infty|X_0=j\neq i)$, but I could not prove it.
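As a sanity check, the chain of equalities can be tested numerically on a small chain. Below is a minimal sketch in Python; the 3-state transition matrix, with an absorbing "escape" state so that return to $i=0$ is not certain, is my own illustrative choice (it is not from the question). The RHS is computed exactly from the hitting probabilities $P(H_i<\infty|X_0=j)$, and the LHS $P(T_i<\infty|X_0=i)$ is estimated by Monte Carlo.

```python
import random
import numpy as np

# Toy chain on states {0, 1, 2}: state 2 is an absorbing escape state,
# so return to i = 0 is not certain.  (Illustrative numbers, my own
# choice.  State 2 has a self loop, but state i = 0 does not, which is
# all the no-self-loop assumption is used for.)
P = np.array([[0.0, 1.0, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.0, 1.0]])
i = 0

# P(H_i < inf | X_0 = j): make i absorbing and solve the absorption
# probabilities h = (I - Q)^{-1} R over the transient states {1}
# (state 2 can never hit i, state i hits itself at time 0).
transient = [1]
Q = P[np.ix_(transient, transient)]
R = P[np.ix_(transient, [i])]
h_transient = np.linalg.solve(np.eye(len(transient)) - Q, R).ravel()
h = {i: 1.0, 1: h_transient[0], 2: 0.0}

# RHS of the identity: sum_j P(X_1 = j | X_0 = i) * P(H_i < inf | X_0 = j)
rhs = sum(P[i, j] * h[j] for j in range(3))

# LHS by Monte Carlo: estimate P(T_i < inf | X_0 = i).  A path that is
# absorbed in state 2 can never return, so a modest horizon suffices.
random.seed(0)
trials, hits = 100_000, 0
for _ in range(trials):
    x, returned = i, False
    for _ in range(50):
        x = random.choices([0, 1, 2], weights=P[x])[0]
        if x == i:
            returned = True
            break
        if x == 2:  # absorbed, can never return to i
            break
    hits += returned
lhs = hits / trials

print(rhs, lhs)  # both should be close to 0.5 for this chain
```

For this chain the exact value is $p(0,1)\,P(H_0<\infty|X_0=1) = 1 \cdot 0.5 = 0.5$, and the Monte Carlo estimate agrees up to sampling error.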

2 Answers

BEST ANSWER

I think @Mason suggests writing the stopping times more explicitly in terms of the random variables, so I try the following approach. Consider the expression on the RHS of the second equality in the original post; I first try to show

$$ \sum_{j\in I} P(T_i < \infty | X_1 = j) P(X_1=j | X_0 = i) = \sum_{j\in I} P(T_i < \infty | X_0 = j) P(X_1=j | X_0 = i) $$

The LHS is $$ \sum_{j\in I} P(T_i < \infty | X_1 = j) P(X_1=j | X_0 = i) $$ $$ = \sum_{j\in I} \sum_{0<n<\infty} P(X_n = i \wedge X_k \neq i \; \forall \; 0<k<n \,|\, X_1 = j) P(X_1=j | X_0 = i) . $$ By time homogeneity (shift every index down by one; the $n=1$ term becomes the separate $P(X_0 = i| X_0= j)$ term, and the shifted constraint $0\le k<n$ reduces to $0<k<n$ because the surviving terms have $X_0=j\neq i$), this equals $$ \sum_{j\in I} \left( \sum_{0< n<\infty} P(X_n = i \wedge X_k \neq i \; \forall \; 0<k<n \,|\, X_0= j) + P(X_0 = i| X_0= j) \right) P(X_1=j | X_0 = i) . $$ The extra term vanishes: for $j\neq i$ we have $P(X_0=i|X_0=j)=0$, and for $j=i$ the factor $P(X_1=i|X_0=i)=0$ because the chain has no self loops. Hence the above equals $$ \sum_{j\in I} \sum_{0< n<\infty} P(X_n = i \wedge X_k \neq i \; \forall \; 0<k<n \,|\, X_0= j) P(X_1=j | X_0 = i) $$ $$ = \sum_{j\in I} P(T_i < \infty | X_0 = j) P(X_1=j | X_0 = i) . $$ Then I try to show, for a fixed $j$,

$$ P(T_i<\infty|X_0=j\neq i) = P(H_i<\infty|X_0=j\neq i) $$

The above is equivalent to (provided $P(X_0 = j) \neq 0$ for the fixed $j \neq i$) $$ P( \color {blue} {H_i<\infty} \wedge \color {orange} {X_0=j\neq i }) = P( \color {green} {T_i<\infty} \wedge \color {orange} {X_0=j\neq i }) \tag{1} $$ Work on the canonical path space: $\omega = (\omega_n)_{n\ge 0} \in \Omega$ with a coordinate map $X$, so that $X_n(\omega) = X(\omega_n)$ for every $n$.

Therefore, in $(1)$, $$ P( \color {blue} {H_i<\infty} \wedge \color {orange} {X_0=j\neq i }) $$ $$ = P \left( \left( \color {blue} {\bigcup_{0<n<\infty} \{\omega : X(\omega_n) = i \wedge X(\omega_k) \neq i \; \forall \; 0 \le k < n \} \cup \{\omega : X(\omega_0) = i\}} \right) \cap \color {orange} {\{\omega : X(\omega_0) = j\neq i\} } \right) $$ Since $\{\omega : X(\omega_0) = i\}$ is disjoint from the orange event, this is $$ = P \left( \bigcup_{0<n<\infty} \{\omega : X(\omega_n) = i \wedge X(\omega_k) \neq i \; \forall \; 0 \le k < n \} \cap \color {orange} {\{\omega : X(\omega_0) = j\neq i\} } \right) $$ and the $k=0$ constraint holds automatically on the orange event, so $$ = P \left( \color {green} {\bigcup_{0<n<\infty} \{\omega : X(\omega_n) = i \wedge X(\omega_k) \neq i \; \forall \; 0<k < n \} } \cap \color {orange} {\{\omega : X(\omega_0) = j\neq i\} } \right) $$ $$ = P( \color {green} {T_i<\infty} \wedge \color {orange} {X_0=j\neq i }) $$

Because the chain has no self loops, $P(X_1 = i | X_0 = i) = 0$, so only terms with $j \neq i$ contribute to the sum over $j$; hence "$j\neq i$" is the only case needed in our problem.
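The pathwise identity proved above, that $H_i$ and $T_i$ coincide on $\{X_0 = j \neq i\}$, is easy to check by simulation: on any path started at $j \neq i$, the $n = 0$ candidate for $H_i$ can never fire. A small sketch (the chain and its transition probabilities are my own illustrative choice, not from the post):

```python
import random

# Illustrative 3-state chain as {state: [(next_state, probability), ...]};
# state 2 is absorbing, so i = 0 is not always hit from j = 1.
P = {0: [(1, 1.0)],
     1: [(0, 0.5), (2, 0.5)],
     2: [(2, 1.0)]}

def step(x):
    states, weights = zip(*P[x])
    return random.choices(states, weights=weights)[0]

def first_times(path, i):
    """Return (H_i, T_i) along a finite path; None means 'not hit'."""
    H = next((n for n, x in enumerate(path) if x == i), None)
    T = next((n for n, x in enumerate(path) if x == i and n >= 1), None)
    return H, T

random.seed(1)
i, j = 0, 1
for _ in range(1000):
    path = [j]
    for _ in range(30):
        path.append(step(path[-1]))
    H, T = first_times(path, i)
    assert H == T  # identical pathwise because path[0] = j != i
```

Started at $i$ itself the two differ ($H_i = 0$ while $T_i$ waits for a genuine return), which is exactly why the question restricts to $j \neq i$.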


I'd write it like this: \begin{align} P_i(T_i < \infty) &= \sum_{j \in I}P_i(T_i < \infty \mid X_1 = j)P_i(X_1 = j) \end{align} Now, in order to use the Markov property, we write $T_i$ as $T_i(X_{0 + \cdot})$. Note that $T_i(X_{0 + \cdot}) = H_i(X_{1 + \cdot}) + 1$, so $\{T_i(X_{0 + \cdot}) < \infty\} = \{H_i(X_{1 + \cdot}) < \infty\}$. Now we can formally apply the Markov property to get \begin{align} P_i(T_i(X_{0 + \cdot}) < \infty) &= \sum_{j \in I}P_i(T_i(X_{0 + \cdot}) < \infty \mid X_1 = j)P_i(X_1 = j) \\ &= \sum_{j \in I}P_i(H_i(X_{1 + \cdot}) < \infty \mid X_1 = j)p(i,j) \\ &= \sum_{j \in I}P_j(H_i(X_{0 + \cdot}) < \infty)p(i, j) \\ &= \sum_{j \in I}P_j(H_i < \infty)p(i, j). \end{align}
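The shift identity $T_i(X_{0 + \cdot}) = H_i(X_{1 + \cdot}) + 1$ used above is purely pathwise, so it can be checked on arbitrary finite sequences; a minimal sketch (the example paths are my own):

```python
# Pathwise check of T_i(X) = H_i(X_{1+.}) + 1 on finite paths.
def T(path, i):
    """First return time: first n >= 1 with path[n] == i (None if absent)."""
    return next((n for n, x in enumerate(path) if x == i and n >= 1), None)

def H(path, i):
    """First hitting time: first n >= 0 with path[n] == i (None if absent)."""
    return next((n for n, x in enumerate(path) if x == i), None)

for path in [[0, 1, 2, 0], [0, 0, 1], [0, 1, 1, 2], [1, 2, 0, 1, 0]]:
    t, h = T(path, 0), H(path[1:], 0)  # path[1:] is the shifted path X_{1+.}
    assert t == (None if h is None else h + 1)
```

Note that the identity holds whether or not the path ever returns: both sides are simultaneously finite or infinite, which is what gives $\{T_i(X_{0+\cdot}) < \infty\} = \{H_i(X_{1+\cdot}) < \infty\}$.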