Is a vector of Markov chains a Markov chain?


Suppose $\{X_{n}\}$ and $\{Y_{n}\}$ are integer-valued Markov chains indexed on nonnegative integers $n$. Is $(X_{n},Y_{n})$ a Markov chain? If not, is it a Markov chain when $\{X_{n}\}$ and $\{Y_{n}\}$ are independent collections of random variables?



Best answer:

A Markov chain has to satisfy the Markov property, and in general the pair $(X_n,Y_n)$ need not. For a counterexample, let $\{X_n\}$ be a Markov chain and define $Y_n = X_{n-2}$; then $\{Y_n\}$ is itself a Markov chain (a time shift of $\{X_n\}$), but the distribution of $(X_n,Y_n) = (X_n, X_{n-2})$ is not determined by $(X_{n-1},Y_{n-1}) = (X_{n-1}, X_{n-3})$ alone, since the second coordinate $X_{n-2}$ is not a function of that state. So without independence the pair can fail to be Markov; independence of $\{X_n\}$ and $\{Y_n\}$ is a sufficient (though not necessary) condition, as the other answer shows.
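As a numerical sanity check of this counterexample, one can enumerate all short paths of a small chain exactly. The two-state transition matrix and uniform initial distribution below are arbitrary choices for illustration; if the pair $Z_n = (X_n, X_{n-2})$ were Markov, conditioning on $Z_2$ in addition to $Z_3$ would not change the conditional law of $Z_4$:

```python
import itertools
import numpy as np

# Hypothetical 2-state chain (values chosen only for illustration).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
init = np.array([0.5, 0.5])

# Enumerate all length-5 paths (x0, ..., x4) with their exact probabilities.
paths = {}
for x in itertools.product([0, 1], repeat=5):
    p = init[x[0]]
    for a, b in zip(x, x[1:]):
        p *= P[a, b]
    paths[x] = p

def cond(event, given):
    """P(event | given), where events are predicates on full paths."""
    num = sum(p for x, p in paths.items() if event(x) and given(x))
    den = sum(p for x, p in paths.items() if given(x))
    return num / den

# Z_n = (X_n, X_{n-2}).  Compare P(Z_4=(0,0) | Z_3=(0,0)) with
# P(Z_4=(0,0) | Z_3=(0,0), Z_2=(0,0)); a Markov pair would give equal values.
z4 = lambda x: (x[4], x[2]) == (0, 0)
z3 = lambda x: (x[3], x[1]) == (0, 0)
z2 = lambda x: (x[2], x[0]) == (0, 0)

p_markov = cond(z4, z3)
p_full   = cond(z4, lambda x: z3(x) and z2(x))
print(p_markov, p_full)  # the two conditionals differ
```

Here knowing $Z_2$ pins down $X_2$ exactly, so the second conditional collapses to $P(X_4=0 \mid X_3=0)$, while the first does not, confirming that the pair fails the Markov property.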

Another answer:

If $\ X_n\ $ and $\ Y_n\ $ are independent Markov chains then $$ \mathrm{P}\left( \left(X_t, Y_t\right) = \left(x_t, y_t\right)\,\left\vert\, \bigcap_{i=0} ^{t-1}\left\{\left(X_i, Y_i\right)=\left(x_i,y_i\right)\right\}\right.\,\right)\\ = \frac{\mathrm{P}\left(\bigcap_{i=0}^t\left\{\left(X_i, Y_i\right)=\left(x_i,y_i\right)\right\}\right)}{\mathrm{P}\left(\bigcap_{i=0}^{t-1}\left\{\left(X_i, Y_i\right)=\left(x_i,y_i\right)\right\}\right)}\\ = \frac{\mathrm{P}\left(\bigcap_{i=0}^t\left\{X_i=x_i\ \right\}\right)\mathrm{P}\left(\bigcap_{i=0}^t\left\{Y_i=y_i\ \right\}\,\right)}{\mathrm{P}\left(\bigcap_{i=0}^{t-1}\left\{X_i=x_i\ \right\}\right)\mathrm{P}\left(\bigcap_{i=0}^{t-1}\left\{Y_i=y_i\ \right\}\,\right)}\\ = \mathrm{P}\left(\,X_t=x_t\,\left\vert\,\bigcap_{i=0}^{t-1}\left\{X_i=x_i\right\}\right.\right)\mathrm{P}\left(\,Y_t=y_t\,\left\vert\,\bigcap_{i=0}^{t-1}\left\{Y_i=y_i\right\}\right.\right)\\ = \mathrm{P}\left(\,X_t=x_t\left\vert\,X_{t-1}=x_{t-1}\right.\right)\mathrm{P}\left(\,Y_t=y_t\left\vert\,Y_{t-1}=y_{t-1}\right.\right)\\ = \frac{\mathrm{P}\left(\,\left\{\,X_t=x_t\,\right\} \cap \left\{\,X_{t-1}=x_{t-1}\right\}\right)\mathrm{P}\left(\,\left\{\,Y_t=y_t\,\right\}\cap\left\{\,Y_{t-1}=y_{t-1}\right\}\right)}{\mathrm{P}\left(\,X_{t-1}=x_{t-1}\right)\mathrm{P}\left(\,Y_{t-1}=y_{t-1}\right)}\\ = \frac{\mathrm{P}\left(\,\left\{\,X_t=x_t\,\right\} \cap \left\{\,X_{t-1}=x_{t-1}\right\}\cap\left\{\,Y_t=y_t\,\right\}\cap\left\{\,Y_{t-1}=y_{t-1}\right\}\right)}{\mathrm{P}\left(\,\left\{\,X_{t-1}=x_{t-1}\right\}\cap \left\{\,Y_{t-1}=y_{t-1}\right\}\right)}\\ = \mathrm{P}\left( \left(X_t, Y_t\right) = \left(x_t, y_t\right)\,\left\vert\,\left(X_{t-1}, Y_{t-1}\right)=\left(x_{t-1},y_{t-1}\right)\right.\,\right)\ . $$ That is, $\ \left(X_n, Y_n\right)\ $ is a Markov chain. 
In the above chain of equations, lines 2, 4, 6 and 8 follow from the definition of conditional probability, lines 3 and 7 from the independence of $\ X_n\ $ and $\ Y_n\ $, and line 5 from their Markov property.
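The final line of the derivation says that the pair chain's one-step transition probabilities factor as a product, which in matrix form means the transition matrix of $(X_n, Y_n)$ is the Kronecker product of the two individual transition matrices. A minimal sketch, with made-up matrices:

```python
import numpy as np

# Illustrative transition matrices for two independent chains (hypothetical values).
P_X = np.array([[0.7, 0.3],
                [0.4, 0.6]])
P_Y = np.array([[0.5, 0.5],
                [0.1, 0.9]])

# For independent chains, the pair (X_n, Y_n) is Markov with transition matrix
# given by the Kronecker product: entry ((x,y),(x',y')) = P_X[x,x'] * P_Y[y,y'].
P_pair = np.kron(P_X, P_Y)

# Rows of a valid transition matrix must sum to 1.
print(P_pair.shape)        # (4, 4)
print(P_pair.sum(axis=1))  # [1. 1. 1. 1.]
```

The pair state $(x, y)$ corresponds to row index $2x + y$ under `np.kron`'s ordering, so e.g. `P_pair[0, 0]` equals `P_X[0, 0] * P_Y[0, 0]`.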