How does a Markov process inherit its homogeneity to the embedded Markov chain?


A homogeneous Markov process $\lbrace X(t),t\geq 0\rbrace $ is given, and the embedded Markov chain $Y_0,Y_1,\ldots$ is defined by $Y_n:=X(T_n)$, where $0=T_0<T_1<\ldots$ are the times at which the process changes state (we assume only finitely many changes occur in any finite interval), and $D_n:=T_n-T_{n-1}$ denotes the holding times. My book states that the embedded Markov chain is homogeneous. I tried for several hours to show this, but I failed.

Homogeneity of the process means $$P(X(s+t)=j\mid X(s)=i)=P(X(t)=j\mid X(0)=i)$$ for all $s,t\geq 0$, and homogeneity of the chain means $$P(Y_n=j\mid Y_{n-1}=i)=P(Y_1=j\mid Y_0=i) \quad \forall n \in \mathbb{N}.$$

Using the homogeneity of the process (shifting time by $T_{n-1}$), it remains to show $$P(X(D_n)=j\mid X(0)=i)=P(X(D_1)=j\mid X(0)=i).$$ My idea was to use $$P(D_n>t\mid X(T_{n-1})=i)=e^{-\alpha_i t}=P(D_1>t\mid X(0)=i),$$ which is given by an earlier theorem. Can somebody help me?
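As background for the question, the embedded (jump) chain of a homogeneous Markov process can be computed explicitly from the generator matrix $Q$: for $i\neq j$ the one-step transition probability is $q_{ij}/q_i$, where $q_i=-q_{ii}$ is the rate of leaving state $i$, and the diagonal is $0$. A minimal sketch, using a made-up 3-state generator (all names and numbers here are illustrative, not from the question):

```python
import numpy as np

# Hypothetical 3-state generator matrix Q; rows sum to 0,
# and q_i = -Q[i, i] is the rate of leaving state i.
Q = np.array([[-3.0,  2.0,  1.0],
              [ 1.0, -4.0,  3.0],
              [ 2.0,  2.0, -4.0]])

rates = -np.diag(Q)          # holding rates q_i
P = Q / rates[:, None]       # divide each row by q_i
np.fill_diagonal(P, 0.0)     # the jump chain never stays put

# Each row of P is a probability distribution over the next state.
print(P)
print(P.sum(axis=1))         # each row sums to 1
```

This jump-chain matrix $P$ does not depend on $n$, which is exactly the homogeneity the question asks about.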

On BEST ANSWER

If we know the process is in state $i$ after $k$ jumps and want the probability that it is in state $j$ after $n$ further jumps, we only need the probability of reaching state $j$ from state $i$ in $n$ jumps. We have $$ \begin{aligned} P(Y_{n+k}=j\mid Y_{k}=i)&=P\bigl(X(T_{n+k})=j\mid X(T_{k})=i\bigr) \\ &= P\bigl(\text{$X$ goes from state $i$ to $j$ in exactly $n$ jumps} \mid \text{state $i$ reached after $k$ jumps}\bigr) \\ &= P\bigl(\text{$X$ goes from state $i$ to $j$ in exactly $n$ jumps}\bigr) \\ &=P\bigl(X(T_{n})=j\mid X(T_0)=i\bigr) \\ &=P(Y_{n}=j\mid Y_0=i). \end{aligned} $$ The third equality is the crucial step: by the (strong) Markov property and the homogeneity of the process, the process restarted at time $T_k$ from state $i$ has the same law as the original process started at time $0$ from state $i$.

In other words, the probability of reaching state $j$ from state $i$ in a given number of jumps depends neither on when state $i$ was entered nor on how many jumps it took to reach state $i.$
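The homogeneity can also be checked numerically. The sketch below (with a made-up generator; all names and numbers are illustrative) simulates many paths of the embedded chain and compares the empirical one-step transition frequencies out of a state at step $k=0$ and at step $k=5$; both should agree with the same row of the jump-chain matrix. Since holding times do not affect which state is entered next, only the jump targets are sampled:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-state generator (rows sum to 0); the embedded
# chain's transition matrix is P[i, j] = q_ij / q_i off-diagonal.
Q = np.array([[-3.0,  2.0,  1.0],
              [ 1.0, -4.0,  3.0],
              [ 2.0,  2.0, -4.0]])
rates = -np.diag(Q)
P = Q / rates[:, None]
np.fill_diagonal(P, 0.0)

def simulate_jumps(n_jumps, start=0):
    """Simulate the embedded chain Y_0, ..., Y_{n_jumps}."""
    states = [start]
    for _ in range(n_jumps):
        states.append(rng.choice(3, p=P[states[-1]]))
    return states

# Estimate P(Y_{k+1} = j | Y_k = 0) for k = 0 and k = 5.
counts = {0: np.zeros(3), 5: np.zeros(3)}
visits = {0: 0, 5: 0}
for _ in range(20000):
    path = simulate_jumps(7)
    for k in (0, 5):
        if path[k] == 0:
            visits[k] += 1
            counts[k][path[k + 1]] += 1

for k in (0, 5):
    print(k, counts[k] / visits[k])  # both close to P[0] = [0, 2/3, 1/3]
```

The two empirical rows agree (up to sampling noise), illustrating that the one-step transition law of the embedded chain does not depend on the step index.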