2 Questions about Markov Chains


The first question: $(X_n : n = 1, 2, \dots)$ is a Markov chain with state space $\{-1, 0, 1\}$.

  1. $(\sin(X_n) : n = 1, 2, \dots)$ is a Markov chain.
  2. $(\cos(X_n) : n = 1, 2, \dots)$ is a Markov chain.
  3. $(|X_n| : n = 1, 2, \dots)$ is a Markov chain.
  4. $(X_n^2 : n = 1, 2, \dots)$ is a Markov chain.
  5. None of the above is correct.

Is option 5 the correct answer, since options 1 & 2 take continuous-looking values, and options 3 & 4 cannot take negative values, so their state space cannot contain $-1$?

The second question: Suppose we have three random variables $X_0, X_1$ and $X_2$ forming a Markov process with time domain $\{0, 1, 2\}$. Which of the following is correct?

  1. $X_1,X_2,X_0$ also form a Markov process.
  2. $X_2,X_0,X_1$ also form a Markov process.
  3. $X_0,X_2,X_1$ also form a Markov process.
  4. $X_2,X_1,X_0$ also form a Markov process.
  5. None of the above is correct.

Is option 4 the correct answer, since it just reverses the one-step transitions?


I'm not sure you understand the questions being asked. In the first question, you are given the Markov chain $(X_0, X_1, \dots)$ with state space $\{ -1, 0, 1 \}$, and you are asked to determine whether each of four derived processes is again a Markov chain, possibly with a different state space.

The trick is to think in the following manner. A chain is Markov if, given its state at the present time, I cannot predict the future any better by also knowing its past.

I'll answer part of question 1 for you, and leave the rest for you to calculate. First, consider the chain

$$ \sin(X_0), \sin(X_1), \dots $$

Since the $X_i$ take values in $\{ -1, 0, 1 \}$, this chain takes values in $\{ \sin(-1), \sin(0), \sin(1) \}$. To determine whether this chain is Markov, we must use our knowledge that the $X_i$ are Markov. The key observation is that this new chain is no different from the original chain: since $\sin$ is injective on $\{-1, 0, 1\}$, we have simply relabeled our observations with different numbers. In particular, knowing $\sin(X_i)$ is exactly the same as knowing $X_i$, so extra knowledge of the new chain's past can never improve a prediction beyond what the original chain's Markov property allows. Formally, we must verify the Markov condition, which here reads

$$ \mathbf{P}(\sin(X_{i+1}) = x | \sin(X_i) = y, \sin(X_{i-1}) = z) = \mathbf{P}(\sin(X_{i+1}) = x | \sin(X_i) = y) $$

This follows because

$$\begin{aligned}
\mathbf{P}(\sin(X_{i+1}) = x | \sin(X_i) = y, \sin(X_{i-1}) = z)
&= \mathbf{P}(X_{i+1} = \sin^{-1}(x) | X_i = \sin^{-1}(y), X_{i-1} = \sin^{-1}(z)) \\
&= \mathbf{P}(X_{i+1} = \sin^{-1}(x) | X_i = \sin^{-1}(y)) \\
&= \mathbf{P}(\sin(X_{i+1}) = x | \sin(X_i) = y)
\end{aligned}$$
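A quick way to see which of the four options is a pure relabeling of the states is to check whether the function is injective on $\{-1, 0, 1\}$. Injectivity guarantees the relabeled chain is Markov; non-injectivity only means the relabeling argument fails, so a separate counterexample is needed. A small Python check (the helper name is my own, for illustration):

```python
import math

states = [-1, 0, 1]

# A function that is injective on the state space merely relabels the
# states, so the relabeled process is again a Markov chain.  Rounding
# guards against floating-point noise in the comparison.
def is_injective_on(f, states):
    images = [round(f(s), 12) for s in states]
    return len(set(images)) == len(images)

print(is_injective_on(math.sin, states))        # True:  sin relabels the states
print(is_injective_on(math.cos, states))        # False: cos(-1) == cos(1)
print(is_injective_on(abs, states))             # False: |-1| == |1|
print(is_injective_on(lambda s: s * s, states)) # False: (-1)^2 == 1^2
```

Only $\sin$ passes, which is why the argument above works for option 1 but not for the others.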

Now the chain $\{ \cos(X_i) \}$ turns out to be a problem, since here we are not merely relabeling our observations: $\cos(-1) = \cos(1)$. Denote this common value by $\phi$. Suppose we flip a coin to determine whether we begin at $-1$ or at $1$, and from then on follow the deterministic cycle $-1 \to 1 \to 0 \to -1$, producing, say,

$$ -1, 1, 0, -1, 1, 0, -1, 1, 0$$

The next state of the sequence is determined completely by the current state, so this chain is Markov. Nonetheless, when we push the values through $\cos$, we obtain the sequence

$$ \phi, \phi, 0, \phi, \phi, 0, \phi, \phi, 0 $$

This random chain is no longer Markov: if we are at $\phi$ and we know we saw a $\phi$ directly before, then we know the next state must be $0$, whereas if we only know we are at $\phi$, the next state could be either $\phi$ or $0$.
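This failure can also be checked mechanically. The sketch below (state labels and variable names are my own) enumerates the pushed-forward sequence from both coin outcomes and compares what the current state alone predicts with what the current and previous states together predict:

```python
from collections import defaultdict

# Transition function of the deterministic cycle -1 -> 1 -> 0 -> -1.
step = {-1: 1, 1: 0, 0: -1}

def run(start, n):
    """Generate n states of the cycle starting from `start`."""
    seq = [start]
    for _ in range(n - 1):
        seq.append(step[seq[-1]])
    return seq

# Push the chain through cos; cos(-1) == cos(1), so label both "phi".
def push(seq):
    return ["phi" if s in (-1, 1) else 0 for s in seq]

# Possible next states given (a) only the current state, and
# (b) the current and previous states, over both possible starts.
next_given_cur = defaultdict(set)
next_given_pair = defaultdict(set)
for start in (-1, 1):
    obs = push(run(start, 30))
    for prev, cur, nxt in zip(obs, obs[1:], obs[2:]):
        next_given_cur[cur].add(nxt)
        next_given_pair[(prev, cur)].add(nxt)

print(len(next_given_cur["phi"]))            # 2: current state alone is ambiguous
print(len(next_given_pair[("phi", "phi")]))  # 1: the extra history pins it down
```

Seeing two possible successors from $\phi$ alone, but exactly one once the previous state is known, is precisely the violation of the Markov property described above.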

For practice and understanding, it's best if you do the rest of the examples yourself.