Markov strong property exercise


Let $(X_n)$ be a Markov chain with transition matrix $P$.
Let $T=\inf\{n \ge 0:X_n \in A\}$ and let $u(x)=P_x(T<+\infty)$.

Prove that $u$ satisfies the system: $$ \begin{cases} u(x)=1 &\text{if } x\in A \\ u(x)=Pu(x) &\text{if } x\notin A \end{cases}$$

My attempt and understanding:

My understanding is that $T$ represents "the first time the chain enters the subset $A$" and $u(x)$ represents "the probability of hitting $A$ starting from $X_0=x$" (since $P_x(T=+\infty)$ is then the probability of never entering $A$).

That being said, the first part of the system makes sense: if $X_0=x \in A$, then $T=0$ (the smallest possible $n$), so we are already in $A$ and the probability of hitting $A$ equals $1$.

The problem is the second part: how do I use the strong Markov property to prove it?



Best answer:

Let $P=(p_{ij})$ and let $I$ be the state space. Suppose $X_0=i\not\in A$, so $T\geq1$. By the usual Markov property we have $$\mathbb P_i(T<\infty\mid X_1=j)=\mathbb P_j(T<\infty)=u(j).$$ Using the law of total probability, we can condition on the possible values of $X_1$ to get \begin{align*} u(i)=\mathbb P_i(T<\infty)=\sum_{j\in I}\mathbb P_i(T<\infty\mid X_1=j)\cdot\mathbb P_i(X_1=j)=\sum_{j\in I}p_{ij}\cdot u(j)=(Pu)(i). \end{align*}
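As a numerical sanity check (not part of the proof), one can verify the system on a small chain. The chain below is a hypothetical example of my own choosing, not from the exercise: a simple random walk on $\{0,1,2,3\}$ with both endpoints absorbing and $A=\{3\}$, so $u(3)=1$ and $u(0)=0$ (state $0$ is absorbing outside $A$).

```python
import numpy as np

# Hypothetical toy chain (an assumption for illustration): simple random
# walk on {0,1,2,3}, endpoints absorbing, A = {3}.
P = np.array([
    [1.0, 0.0, 0.0, 0.0],   # 0 is absorbing and outside A, so u(0) = 0
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 1.0],   # 3 is in A, so u(3) = 1
])

# Solve u = Pu on the interior states {1, 2}, with the boundary values
# u(0) = 0 and u(3) = 1 substituted in:
#   u(1) = 0.5*u(0) + 0.5*u(2)
#   u(2) = 0.5*u(1) + 0.5*u(3)
interior = [1, 2]
M = np.eye(2) - P[np.ix_(interior, interior)]
b = P[np.ix_(interior, [0, 3])] @ np.array([0.0, 1.0])
u_int = np.linalg.solve(M, b)

u = np.array([0.0, u_int[0], u_int[1], 1.0])
assert np.allclose(u[interior], (P @ u)[interior])  # u(x) = (Pu)(x) off A
print(u)  # u = (0, 1/3, 2/3, 1)
```

This recovers the familiar gambler's-ruin values $u(x)=x/3$, and the assertion checks the second equation of the system directly.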

Another answer:

The starting point is that if $x \not \in A$ then $P(T<\infty \mid X_0=x) =\sum_y P(T<\infty \mid X_0=x,X_1=y) P(X_1=y \mid X_0=x)$. This follows from the law of total probability, conditioning on the outcome of the first step. Then simplify that sum using the Markov property of the chain.
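The first-step decomposition above can also be checked by simulation. The chain here is a hypothetical toy example of my own (an assumption, not from the exercise): a random walk on $\{0,1,2,3\}$ with $0$ and $3$ absorbing and $A=\{3\}$, so $\{T=\infty\}$ corresponds to absorption at state $0$.

```python
import random

# Hypothetical toy chain (assumption for illustration): random walk on
# {0,1,2,3}; states 0 and 3 are absorbing, A = {3}. Every trajectory is
# eventually absorbed, so {T < infinity} = {absorbed at 3}.
def hits_A(x):
    """Run one trajectory from x; True iff it reaches A = {3}."""
    while x not in (0, 3):
        x += random.choice((-1, 1))
    return x == 3

def estimate_u(x, n=100_000):
    """Monte Carlo estimate of u(x) = P_x(T < infinity)."""
    return sum(hits_A(x) for _ in range(n)) / n

random.seed(0)
u1, u2 = estimate_u(1), estimate_u(2)
# First-step identity at x = 1: u(1) = 0.5*u(0) + 0.5*u(2), with u(0) = 0.
print(u1, 0.5 * u2)  # the two estimates should agree up to Monte Carlo noise
```

Both printed numbers should be close to $1/3$, matching the exact hitting probabilities $u(x)=x/3$ for this walk.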