Is this sequence a Markov chain?


Let $\xi_n, n\geq 1$ be a sequence of discrete random variables, $\xi_n \in A=\{1,2,...,K\}, \forall n$. This sequence satisfies the following condition: $$P(\xi_n=i_1|\xi_m=i_2,\xi_k=i_3)=P(\xi_n=i_1|\xi_m=i_2), \forall n>m>k, i_1,i_2,i_3 \in A.$$ The question: Will this sequence be a Markov chain?
Any suggestions or examples are appreciated. Thank you.

Best answer:

Assuming $\xi_n$ is adapted to the filtration $F_n$, the Markov property is \begin{align} \mathbb{P} \left( \xi_n = i_1 \mid F_{m} \right) = \mathbb{P} \left( \xi_n = i_1 \mid \xi_{m} \right), \quad \forall n>m,\ i_1 \in A, \end{align} which intuitively means that the only information relevant for future jumps is the most recent state. For your sequence, the Markov property reads \begin{align} \mathbb{P} \left( \xi_n = i_1 \mid \xi_m = i_2, \xi_{m-1}=i_3, \dots, \xi_1=i_{m+1} \right) = &\mathbb{P} \left( \xi_n = i_1 \mid \xi_{m} =i_2 \right), \\ \forall n>m,\ i_1, \dots, i_{m+1} \in A, \end{align} i.e. the conditioning runs over the *entire* past, not just two earlier times. This is strictly stronger than the property you were given, so the given property alone does not let you conclude that $(\xi_n)_{n \geq 1}$ is a Markov chain.
Imagine that knowledge of 3 consecutive values $\xi_{n},\xi_{n-1},\xi_{n-2}$ changed the jump probabilities for $\xi_{n+1}$ compared to knowing $\xi_n$ alone: \begin{align} \mathbb{P} \left( \xi_{n+1} = i_1 \mid \xi_n = i_2, \xi_{n-1}=i_3,\xi_{n-2}=i_{4} \right) \neq &\mathbb{P} \left( \xi_{n+1} = i_1 \mid \xi_{n} =i_2 \right) \end{align} for some $n$ and some $i_1, i_2, i_3, i_{4} \in A$. Such a sequence would not have the Markov property, but could still satisfy the property you were given, since that property only ever conditions on two past times.
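To make this concrete, here is a small numerical check (not from the original answer) of a standard XOR-style construction: take $\xi_1, \xi_2, \xi_3$ i.i.d. uniform on $\{0,1\}$ and set $\xi_4 = \xi_1 \oplus \xi_2 \oplus \xi_3$. The script below enumerates the 8 equally likely outcomes and verifies that every two-point conditional collapses as in your property, yet the full-past conditional does not, so the sequence is not Markov. The function name `cond` is my own.

```python
from itertools import product
from fractions import Fraction

# Joint law: xi1, xi2, xi3 i.i.d. uniform on {0,1}; xi4 = xi1 XOR xi2 XOR xi3.
# Each of the 8 tuples below has probability 1/8.
outcomes = [(x1, x2, x3, x1 ^ x2 ^ x3) for x1, x2, x3 in product((0, 1), repeat=3)]

def cond(target, value, given):
    """P(xi_target = value | xi_j = v_j for (j, v_j) in given), times 1-indexed."""
    match = [o for o in outcomes if all(o[j - 1] == v for j, v in given)]
    hit = [o for o in match if o[target - 1] == value]
    return Fraction(len(hit), len(match))

# The given property: P(xi_n=i1 | xi_m=i2, xi_k=i3) == P(xi_n=i1 | xi_m=i2)
# for all n > m > k and all binary values.
pairwise_ok = all(
    cond(n, i1, [(m, i2), (k, i3)]) == cond(n, i1, [(m, i2)])
    for n in range(1, 5) for m in range(1, n) for k in range(1, m)
    for i1 in (0, 1) for i2 in (0, 1) for i3 in (0, 1)
)
print(pairwise_ok)  # True: the two-point property holds

# ...but the Markov property fails: the whole past pins xi4 down,
# while conditioning on xi3 alone does not.
print(cond(4, 0, [(3, 0), (2, 0), (1, 0)]))  # 1
print(cond(4, 0, [(3, 0)]))                  # 1/2
```

So the property in the question can hold while the Markov property fails, confirming the answer's point.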

EDIT:
Here is a more concrete example. Let $\left(X_n \right)_{n \geq 0}$ be a sequence of discrete random variables, $X_n \in A = \{ 1,2,\dots,K \}$ for all $n$, where \begin{align} \mathbb{P} \left(X_{n+1} =a+1 \mid X_n = a \right) &= \frac{2}{3},\quad \forall n,\ a \in \{ 1,2,\dots,K-1 \}, \\ \mathbb{P} \left(X_{n+1} =a \mid X_n = a \right) &= \frac{1}{3},\quad \forall n,\ a \in \{ 1,2,\dots,K \}, \\ \mathbb{P} \left(X_{n+1} =1 \mid X_n = K \right) &= \frac{2}{3},\quad \forall n, \\ \mathbb{P} \left( X_{n} = a_1 \mid X_{m} =a_2 , X_k = a_3 \right) &= \mathbb{P} \left( X_{n} = a_1 \mid X_{m} =a_2 \right), \quad \forall n>m>k,\ a_1, a_2, a_3 \in A. \end{align} So far this looks like a Markov chain, and if no further information about the sequence were given, it would be fair to assume it is one (i.e. assuming all unspecified probabilities are $0$). Notably, it has the desired property from your problem. However, if it were furthermore given that \begin{align} \mathbb{P} \left( X_{n+1} = 1 \mid X_{n} =a , X_{n-1} = a, X_{n-2} =a \right) &=1, \quad \forall n \geq 2,\ a\in A, \end{align} the sequence could not have the Markov property (and thus could not be a Markov chain), despite having the property from the problem. This is because the Markov property would require \begin{align} \mathbb{P} \left( X_{n+1} = 1 \mid X_{n} =a , X_{n-1} = a, X_{n-2} =a \right) = \mathbb{P} \left( X_{n+1} = 1 \mid X_{n} =a \right), \end{align} while here the left-hand side is $1$ and the right-hand side is at most $\frac{2}{3}$.
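The base chain above (before the extra all-past condition) is easy to simulate and check against the specified transition probabilities. This is a minimal sketch; the choice $K=5$, the step count, and the function names are my own, not part of the problem.

```python
import random

def simulate_chain(K=5, steps=200_000, seed=42):
    """Simulate the cyclic chain on {1,...,K}: from state a, move to a+1
    (wrapping K -> 1) with probability 2/3, stay put with probability 1/3."""
    rng = random.Random(seed)
    x = 1
    path = [x]
    for _ in range(steps):
        if rng.random() < 2 / 3:
            x = x + 1 if x < K else 1
        path.append(x)
    return path

def empirical_transition(path, a, b):
    """Estimate P(X_{n+1} = b | X_n = a) from a sample path by counting."""
    num = den = 0
    for cur, nxt in zip(path, path[1:]):
        if cur == a:
            den += 1
            num += (nxt == b)
    return num / den

path = simulate_chain()
print(empirical_transition(path, 2, 3))  # ≈ 2/3
print(empirical_transition(path, 2, 2))  # ≈ 1/3
```

With a long enough path the empirical frequencies match the specified $\frac{2}{3}$ and $\frac{1}{3}$; it is only the extra condition on three repeated states that destroys the Markov property.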