If a Markov chain depends on its two previous states?


I'm currently working on a project involving a Markov chain, but this chain is non-standard: $X_t$ depends on its two previous states $X_{t-2}$ and $X_{t-1}$. In a standard Markov chain, $X_t$ depends only on its previous state $X_{t-1}$, and we can express the distribution of $X_t$ easily.

How should I start analyzing a chain in which the state $X_t$ depends on the two previous states $X_{t-2}$ and $X_{t-1}$? Does this non-standard chain retain the properties of a standard Markov chain, so that the probability distribution at time $t$ can still be expressed?

How can the probability of being in state $k$ be expressed? Is there any relevant literature studying this question?

Accepted answer:

The chain described above is essentially still a Markov chain. Assuming $X_t$ is indexed by $t\geq 0$, consider the vector-valued sequence $$ Z_t:=\left(\begin{array}{c} X_t\\X_{t-1}\end{array}\right). $$ Since $X_t$ "depends" only on the two previous states (I am using this word loosely to describe the Markov property, as you did), the pair $Z_t$ depends only on $Z_{t-1}$; that is, $(Z_t)$ satisfies the ordinary first-order Markov property.

With this definition, $Z_0$ is not defined, since it would require $X_{-1}$. If $X_0$ and $X_1$ are any two random variables, with or without a particular relation between them (or even constants), you simply start the chain with $Z_1=(X_1,X_0)^\top$ as your initial state, and everything works perfectly.
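The construction above can be sketched numerically. The following is a minimal illustration with a hypothetical second-order kernel `q` on two states: the pair chain $Z_t=(X_t,X_{t-1})$ gets an ordinary $4\times 4$ transition matrix `T`, the initial state is the distribution of $Z_1=(X_1,X_0)$, and the marginal of $X_t$ is recovered by summing the pair distribution over the second component. All numbers here are made up for the example.

```python
import numpy as np

states = [0, 1]
# Pairs ordered as Z_t = (X_t, X_{t-1}): (current, previous).
pairs = [(c, p) for c in states for p in states]

# Hypothetical second-order kernel: q[(i, j)][k] = P(X_t = k | X_{t-2} = i, X_{t-1} = j).
q = {
    (0, 0): [0.9, 0.1],
    (0, 1): [0.4, 0.6],
    (1, 0): [0.7, 0.3],
    (1, 1): [0.2, 0.8],
}

# First-order transition matrix on pairs: (j, i) -> (k, j) with probability
# q[(i, j)][k]; any move whose "previous" slot disagrees gets probability 0.
T = np.zeros((4, 4))
for a, (j, i) in enumerate(pairs):          # Z_{t-1} = (X_{t-1}, X_{t-2})
    for b, (k, j2) in enumerate(pairs):     # Z_t     = (X_t,     X_{t-1})
        if j2 == j:
            T[a, b] = q[(i, j)][k]

assert np.allclose(T.sum(axis=1), 1.0)      # valid Markov kernel: rows sum to 1

# Distribution of the initial state Z_1 = (X_1, X_0); here X_0, X_1 are
# taken to be independent fair coins, so Z_1 is uniform on the four pairs.
mu = np.full(4, 0.25)

def marginal_x(t):
    """P(X_t = k) for t >= 1, via t-1 steps of the pair chain."""
    dist = mu @ np.linalg.matrix_power(T, t - 1)   # distribution of Z_t
    return np.array([sum(dist[b] for b, (k, _) in enumerate(pairs) if k == s)
                     for s in states])

# For these numbers, marginal_x(2) == [0.55, 0.45] (up to rounding).
print(marginal_x(2))
print(marginal_x(5))
```

This also makes the initialization point concrete: the pair chain simply starts at $t=1$, and the probability of state $k$ at time $t$ is a marginal of the pair distribution, computed exactly as for a standard chain.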