Does markovian property imply independence?


A stochastic process has the Markov property if the conditional probability distribution of future states of the process (conditional on both past and present states) depends only upon the present state, not on the sequence of events that preceded it. A process with this property is called a Markov process.

This is the Google definition.

My book defines:

$\Pr(X_{i+1}<x \mid X_i = x_i, X_{i-1}=x_{i-1}, \ldots, X_0=x_0) = \Pr(X_{i+1}<x \mid X_i = x_i)$

Does this mean that:

  1. $X_{i+1}$ is independent of $X_{i-1}$, etc., or

  2. $X_i$ contains all the information of $X_{i-1}, X_{i-2}$, etc.?

There are 2 answers below.


Your book's formula says the same thing as the Google definition: the probability that the $(i+1)$-th random variable takes a value less than some $x$, given all the previous values, is equal to that probability given only the single most recent value.


$$\mathsf P(X_{i+1}<x \mid X_i=x_i,X_{i-1}=x_{i-1},\ldots, X_0=x_0) ~=~\mathsf P(X_{i+1}<x\mid X_i=x_i)$$

For events $F,C,P$, the expression $\Pr(F\mid C, P)=\Pr(F\mid C)$ means that $F,P$ are conditionally independent given $C$.

In this example you have events for the future state, current state, and all prior states:

  • $F = \{X_{i+1}<x\}$
  • $C = \{X_i=x_i\}$
  • $P = \{X_{i-1}=x_{i-1},\ldots, X_0=x_0\}$

So interpretation (2) is the right one: by the Markov property, an event concerning a future state is conditionally independent of the values of all prior states, given the value of the current state. It does not say that $X_{i+1}$ is (unconditionally) independent of $X_{i-1}$; in general it is not.
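A quick simulation makes the distinction concrete. The sketch below (my own toy example, not from the question) uses a "sticky" two-state Markov chain that stays in its current state with probability 0.9. Unconditionally, $X_2$ is strongly dependent on $X_0$; but once you condition on $X_1$, knowing $X_0$ changes nothing.

```python
import random

random.seed(0)

def step(s, p_stay=0.9):
    """One transition of a sticky two-state chain (states 0 and 1)."""
    return s if random.random() < p_stay else 1 - s

# Simulate many short runs (X0, X1, X2) with X0 uniform on {0, 1}.
runs = []
for _ in range(200_000):
    x0 = random.randint(0, 1)
    x1 = step(x0)
    x2 = step(x1)
    runs.append((x0, x1, x2))

def cond_prob(event, given):
    """Empirical P(event | given) over the simulated runs."""
    matching = [r for r in runs if given(r)]
    return sum(event(r) for r in matching) / len(matching)

# Unconditionally, X2 is NOT independent of X0: its distribution shifts
# (theoretically P(X2=1 | X0=0) = 0.18 and P(X2=1 | X0=1) = 0.82).
p2_given_x0_is_0 = cond_prob(lambda r: r[2] == 1, lambda r: r[0] == 0)
p2_given_x0_is_1 = cond_prob(lambda r: r[2] == 1, lambda r: r[0] == 1)

# But given X1 = 1, X0 carries no extra information: both conditionals
# equal P(X2=1 | X1=1) = 0.9 (conditional independence).
p2_given_x1_x0_0 = cond_prob(lambda r: r[2] == 1,
                             lambda r: r[1] == 1 and r[0] == 0)
p2_given_x1_x0_1 = cond_prob(lambda r: r[2] == 1,
                             lambda r: r[1] == 1 and r[0] == 1)
```

Here `p2_given_x0_is_1 - p2_given_x0_is_0` comes out large (so option 1 fails), while `p2_given_x1_x0_0` and `p2_given_x1_x0_1` agree to within sampling noise (option 2 holds).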