How to prove that a Markov chain with specific transition probabilities has independent increments?


I have a Markov chain $N=\{N(t) \mid t\geq 0 \}$ with state space $\{0,1,2,\dots\}$. I know that it is homogeneous and that its transition probabilities are: $$ p_{ij}(s,t)=P(N(t)=j\mid N(s)=i) = p_{ij}(t-s)=\left\{ \begin{array}{ll} \frac{(\lambda (t-s))^{j-i}}{(j-i)!} e^{-\lambda (t-s)} & \quad j\geq i\\ 0 & \quad \text{otherwise} \end{array} \right. $$ with initial distribution $p_i(0)=P(N(0)=i)=\delta_{i0}$.
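As a sanity check on these transition probabilities, one can verify the Chapman–Kolmogorov equation $\sum_k p_{ik}(s,u)\,p_{kj}(u,t)=p_{ij}(s,t)$ numerically. A minimal sketch (the helper name `p_trans` and the parameter values are mine, not from the problem):

```python
import math

def p_trans(i, j, dt, lam):
    # Transition probability p_{ij}(dt) from the problem statement:
    # Poisson(lam*dt) pmf at j - i, and 0 when j < i.
    if j < i:
        return 0.0
    k = j - i
    return (lam * dt) ** k * math.exp(-lam * dt) / math.factorial(k)

# Chapman–Kolmogorov check: sum over the intermediate state k at time u.
# Only k in {i, ..., j} contributes, since the chain never decreases.
lam, s, u, t, i, j = 2.0, 0.5, 1.2, 3.0, 1, 5
lhs = sum(p_trans(i, k, u - s, lam) * p_trans(k, j, t - u, lam)
          for k in range(i, j + 1))
rhs = p_trans(i, j, t - s, lam)
assert abs(lhs - rhs) < 1e-12
```

The identity holds exactly here because the convolution of two Poisson distributions is again Poisson with the summed means.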

How do I prove that this process has independent increments?

Thank you.


2 Answers

Best answer:

I think I got it.

$$\begin{aligned}
P(N(t_1)=x_1,\,&N(t_2)-N(t_1)=x_2,\dots,N(t_m)-N(t_{m-1})=x_m)\\
&= P \left( N(t_i)=\sum_{j=1}^i x_j \text{ for all } i=1,\dots,m \right) \\
&= P \left( N(t_m)=x_1+\dots+x_m \,\middle|\, N(t_i)=\sum_{j=1}^i x_j ~\forall\, i=1,\dots,m-1 \right) P \left( N(t_i)=\sum_{j=1}^i x_j ~\forall\, i=1,\dots,m-1 \right)\\
&\overset{\text{Markov property}}{=} P(N(t_m)=x_1+\dots+x_m \mid N(t_{m-1})=x_1+\dots+x_{m-1})\, P \left( N(t_i)=\sum_{j=1}^i x_j ~\forall\, i=1,\dots,m-1 \right)\\
&= p_{x_1+\dots+x_{m-1},\,x_1+\dots+x_m}(t_{m-1},t_m)\, P \left( N(t_i)=\sum_{j=1}^i x_j ~\forall\, i=1,\dots,m-1 \right)\\
&= [\text{repeat this procedure}] = p_{x_1,x_1+x_2}(t_1,t_2)\, p_{x_1+x_2,x_1+x_2+x_3}(t_2,t_3)\cdots p_{x_1+\dots+x_{m-1},\,x_1+\dots+x_m}(t_{m-1},t_m)\, P(N(t_1)=x_1)\\
&= \left[ \prod_{j=1}^{m-1} p_{\sum_{k=1}^j x_k,\, \sum_{k=1}^{j+1} x_k}(t_j,t_{j+1}) \right] P(N(t_1)=x_1)\\
&= \left[ \prod_{j=1}^{m-1} \frac{(\lambda(t_{j+1}-t_j))^{x_{j+1}}}{x_{j+1}!}\, e^{-\lambda(t_{j+1}-t_j)} \right] \frac{(\lambda t_1)^{x_1}}{x_1!}\, e^{-\lambda t_1}\\
&= \prod_{i=1}^m P(N(t_i)-N(t_{i-1})=x_i),
\end{aligned}$$

with the convention $t_0=0$ (so $N(t_0)=0$). The last equality uses that each increment $N(t_i)-N(t_{i-1})$ has a Poisson$(\lambda(t_i-t_{i-1}))$ distribution, as computed in the other answer.
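The factorization above can also be spot-checked by simulation: generate the process from exponential interarrival times with rate $\lambda$ and compare the empirical joint distribution of two increments with the product of their empirical marginals. This is a minimal sketch; the helper name `simulate_counts` and all parameter choices are mine:

```python
import math
import random

def simulate_counts(lam, t1, t2, n_runs, seed=0):
    """Simulate n_runs Poisson processes with rate lam via exponential
    interarrival times; return the increments N(t1)-N(0) and N(t2)-N(t1)
    for each run."""
    rng = random.Random(seed)
    inc1, inc2 = [], []
    for _ in range(n_runs):
        arrivals = []
        t = rng.expovariate(lam)
        while t <= t2:
            arrivals.append(t)
            t += rng.expovariate(lam)
        inc1.append(sum(1 for a in arrivals if a <= t1))
        inc2.append(sum(1 for a in arrivals if t1 < a <= t2))
    return inc1, inc2

lam, t1, t2, n = 1.0, 1.0, 2.0, 100_000
inc1, inc2 = simulate_counts(lam, t1, t2, n)

# Empirical joint probability of (0, 0) vs. product of empirical marginals;
# if the increments are independent, both should be close to e^{-2}.
p_joint = sum(1 for a, b in zip(inc1, inc2) if a == 0 and b == 0) / n
p_prod = (inc1.count(0) / n) * (inc2.count(0) / n)
assert abs(p_joint - p_prod) < 0.01
```

A Monte Carlo check of course proves nothing, but it catches an incorrect factorization quickly.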

Second answer:

You must show that $P(N(t)=i+j \mid N(s)=i) = P(N(t)-N(s)=j)$ for all $0 \le s \le t$ and every state $i$; that is, the increment over $(s,t]$ has the same distribution regardless of the state at time $s$.

$P(N(t)=i+j \mid N(s) = i) = \cfrac{(\lambda(t-s))^j}{j!}e^{-\lambda(t-s)}$

$$P(N(t)-N(s)=j) = \sum\limits_{i=0}^\infty P(N(t)=j+i\mid N(s)=i)\,P(N(s)=i) = \sum\limits_{i=0}^\infty \cfrac{(\lambda(t-s))^j}{j!}e^{-\lambda(t-s)}\cfrac{(\lambda s)^i}{i!}e^{-\lambda s} = \cfrac{(\lambda(t-s))^j}{j!}e^{-\lambda(t-s)}\, e^{-\lambda s}\sum\limits_{i=0}^\infty \cfrac{(\lambda s)^i}{i!} = \cfrac{(\lambda(t-s))^j}{j!}e^{-\lambda(t-s)}\, e^{-\lambda s}\, e^{\lambda s} = \cfrac{(\lambda(t-s))^j}{j!}e^{-\lambda(t-s)} \qquad \square$$

Here $P(N(s)=i)=\frac{(\lambda s)^i}{i!}e^{-\lambda s}$ follows from the given transition probabilities and the initial condition $N(0)=0$.
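The collapse of the sum is easy to verify numerically: the conditional probability does not depend on $i$, so summing the Poisson$(\lambda s)$ weights contributes a factor $e^{-\lambda s}e^{\lambda s}=1$. A minimal sketch (the helper name `poisson_pmf` and the parameter values are mine; the infinite series is truncated, which is harmless since the terms decay factorially fast):

```python
import math

def poisson_pmf(k, mu):
    # Poisson pmf: mu^k e^{-mu} / k!
    return mu ** k * math.exp(-mu) / math.factorial(k)

lam, s, t, j = 2.0, 1.5, 4.0, 3
# Condition on N(s) = i and sum over i, exactly as in the answer.
lhs = sum(poisson_pmf(j, lam * (t - s)) * poisson_pmf(i, lam * s)
          for i in range(100))
# The weights sum to 1, so only the increment's pmf remains:
rhs = poisson_pmf(j, lam * (t - s))
assert abs(lhs - rhs) < 1e-12
```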