Independent Increments Have the Markov Property: Proof


The proof in the book goes something like this: for all times $s_1 < s_2 < \cdots < s_n < s < t$, all states $x_1, x_2, \dots, x_n$ and $x$ in the state space $S$, and all subsets $A$ of $S$,

$$P[X_t \in A \mid X_{s_1} = x_1, X_{s_2} = x_2, \dots, X_{s_n} = x_n, X_s = x]$$

$$= P[X_t - X_s + x \in A \mid X_{s_1} = x_1, X_{s_2} = x_2, \dots, X_{s_n} = x_n, X_s = x]$$

$$= P[X_t - X_s + x \in A \mid X_s = x]$$

$$= P[X_t \in A \mid X_s = x].$$

This is the proof that processes with independent increments have the Markov property, but can someone make explicit why one step is allowed to simply delete the conditions $X_{s_1} = x_1, X_{s_2} = x_2, \dots, X_{s_n} = x_n$, keeping only $X_s = x$? I thought independence meant $P(A \cap B) = P(A)\,P(B)$. What is meant by independence here? A simple explanation would be great.

Independence here cannot really be compared with the classical notion of stochastic independence of two events. The Markov property states, in your setting, that $$P[X_t \in A \mid X_{s_1} = x_1, X_{s_2} = x_2, \dots, X_{s_n} = x_n, X_s = x] = P[X_t \in A \mid X_{s} = x].$$ In words: whatever happened to the process before time $s$ gives no additional information about what will happen at time $t$. The Markov chain is a random process that, when moving to the next point, only sees the point at which it is currently standing.
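To make this concrete, here is a small numerical sketch (my own illustration, not from the book): a simple random walk with i.i.d. $\pm 1$ steps has independent increments, and we can check empirically that conditioning on the past beyond the current state does not change the conditional distribution of the next value.

```python
import numpy as np

# Simple random walk: X_k is the sum of k i.i.d. +/-1 steps,
# so the process has independent increments.
rng = np.random.default_rng(0)
n_paths = 200_000
steps = rng.choice([-1, 1], size=(n_paths, 3))
X = np.cumsum(steps, axis=1)  # columns are X_1, X_2, X_3

# Compare P[X_3 = 1 | X_1 = -1, X_2 = 0] with P[X_3 = 1 | X_2 = 0]:
# the Markov property says the extra condition X_1 = -1 is irrelevant.
full_cond = (X[:, 0] == -1) & (X[:, 1] == 0)
markov_cond = X[:, 1] == 0
p_full = np.mean(X[full_cond, 2] == 1)
p_markov = np.mean(X[markov_cond, 2] == 1)
print(p_full, p_markov)  # both close to 0.5
```

Both estimates agree (up to sampling noise) with the exact value $1/2$, because reaching $X_3 = 1$ from $X_2 = 0$ requires one more $+1$ step, regardless of where the walk was at time $1$.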

The event $X_t \in A$ can be replaced by $X_t - X_s + x \in A$ because, on the event $\{X_s = x\}$, we have $X_t = (X_t - X_s) + x$. Now $X_t - X_s$ is one of the stochastically independent increments, which makes it in particular independent of $X_{s_1}, \dots, X_{s_n}$, but not of $X_s$, since $X_s$ clearly appears in $X_t - X_s$. A standard fact about conditional probability then finishes the argument: if $X$ and $Y$ are stochastically independent, then $P(X \in \cdot \mid Y = y) = P(X \in \cdot)$.
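That standard fact can also be checked numerically in the random-walk setting (again just an illustrative sketch of my own): the increment $X_3 - X_2$ is one of the i.i.d. steps, hence independent of $X_1$, so conditioning on $X_1$ leaves its distribution unchanged.

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths = 200_000
steps = rng.choice([-1, 1], size=(n_paths, 3))
X = np.cumsum(steps, axis=1)  # columns are X_1, X_2, X_3

# The increment D = X_3 - X_2 equals the third step, so it is
# independent of X_1: P(D = 1 | X_1 = 1) should equal P(D = 1) = 1/2.
D = X[:, 2] - X[:, 1]
p_marginal = np.mean(D == 1)
p_conditional = np.mean(D[X[:, 0] == 1] == 1)
print(p_marginal, p_conditional)  # both close to 0.5
```

This is exactly the mechanism that lets the proof drop the conditions $X_{s_1} = x_1, \dots, X_{s_n} = x_n$: the increment is independent of those earlier values, so they carry no information about it.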