Proving Markov property for $Z_n =1_{\{X_n=G\}}+1_{\{Y_n=G\}} $


Let $X_n,Y_n$ be two independent, time-homogeneous Markov chains taking values in $\left\{ G,B\right\}$.

The probability that $X_n $ changes state is $p$ and the probability that $X_n$ stays the same is $1-p$.

The probability that $Y_n $ changes state is $q$ and the probability that $Y_n$ stays the same is $1-q$.

Define a new process $$Z_{n}=1_{\{X_{n}=G\}}+1_{\{Y_{n}=G\}}.$$ For $p=q$, prove that $Z_n$ is a Markov process.
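To see numerically why $p=q$ matters, here is a small sketch (all helper names are my own, not from the post): it builds the transition matrix of the 4-state product chain $(X_n,Y_n)$ and tests the standard lumpability criterion, namely that joint states with the same $Z$-value must assign identical probabilities to each $Z$-class.

```python
import numpy as np

# Joint states of (X_n, Y_n) and the Z-value of each.
states = [('G', 'G'), ('G', 'B'), ('B', 'G'), ('B', 'B')]
z_of = [2, 1, 1, 0]  # Z_n = number of G's in each joint state

def joint_transition(p, q):
    """4x4 transition matrix of (X_n, Y_n): X flips with prob p,
    Y flips with prob q, independently of each other."""
    P = np.zeros((4, 4))
    for i, (x, y) in enumerate(states):
        for j, (x2, y2) in enumerate(states):
            P[i, j] = (p if x2 != x else 1 - p) * (q if y2 != y else 1 - q)
    return P

def lumped_rows(P):
    """Aggregate each row of P over the Z-classes {0, 1, 2}."""
    R = np.zeros((4, 3))
    for i in range(4):
        for j in range(4):
            R[i, z_of[j]] += P[i, j]
    return R

p = 0.3  # arbitrary value, chosen only for the check
R = lumped_rows(joint_transition(p, p))
# Lumpability: rows 1 and 2 are the joint states (G,B) and (B,G),
# both with Z = 1; their aggregated rows must coincide.
print(np.allclose(R[1], R[2]))  # True exactly when p == q
```

Rerunning the last check with `joint_transition(p, q)` for $p\neq q$ makes the aggregated rows differ, which is why the statement needs the hypothesis $p=q$.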

The process $Z_n$ takes values in $\{0,1,2\}$. I tried working from the definition: $$P\left(Z_{n}=\alpha_{n}\,\middle|\,Z_{n-1}=\alpha_{n-1},\dots,Z_{1}=\alpha_{1}\right)= \\ P\left(1_{\{X_{n}=G\}}+1_{\{Y_{n}=G\}}=\alpha_{n}\,\middle|\,1_{\{X_{n-1}=G\}}+1_{\{Y_{n-1}=G\}}=\alpha_{n-1},\dots,1_{\{X_{1}=G\}}+1_{\{Y_{1}=G\}}=\alpha_{1}\right)$$ I don't know how to use the independence of $X_n$ and $Y_n$ here, because the conditioning involves sums of indicators, nor how $p$ and $q$ enter so that the Markov property holds when $p=q$.

Best answer

Hints

  • The product of two independent Markov chains is also a Markov chain.
  • $\ \big\{Z_n=2\big\}= \big\{(X_n,Y_n)=(G,G)\big\}\ ,$ $\ \big\{Z_n=0\big\}= \big\{(X_n,Y_n)=(B,B)\big\}\ ,$ $\ \big\{Z_n=1\big\}= \big\{(X_n,Y_n)=(G,B)\big\}\cup\big\{(X_n,Y_n)=(B,G)\big\}$
  • When $\ p=q\ $, \begin{align}P\big( Z_{n+1}=i\,\big|\,&(X_n,Y_n)=(B,G)\big)\\&=P\big( Z_{n+1}=i\,\big|\,(X_n,Y_n)=(G,B)\big)\\ &=\cases{p(1-p)&for $\ i=0,2$\\p^2+(1-p)^2&for $\ i=1$} \end{align} The key here is that, because of this identity, the two states $\ (B,G)\ $ and $\ (G,B)\ $ are equivalent, and therefore when you amalgamate them into the single state $\{Z_n=1\}$, the result is still a Markov chain.
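The displayed probabilities can be verified by enumerating the four possible joint moves out of $(B,G)$; the following is a minimal sketch (variable names are my own), assuming each chain flips independently with the same probability $p$:

```python
from itertools import product

p = 0.25  # any value works; p = q is assumed throughout

# Start from (X_n, Y_n) = ('B', 'G') and enumerate the four joint moves.
probs = {0: 0.0, 1: 0.0, 2: 0.0}
for x_flips, y_flips in product([False, True], repeat=2):
    x2 = 'G' if x_flips else 'B'   # X_n was B
    y2 = 'B' if y_flips else 'G'   # Y_n was G
    pr = (p if x_flips else 1 - p) * (p if y_flips else 1 - p)
    probs[(x2 == 'G') + (y2 == 'G')] += pr  # key is Z_{n+1}

# Agrees with the hint: p(1-p) for i = 0, 2 and p^2 + (1-p)^2 for i = 1.
print(probs)  # {0: 0.1875, 1: 0.625, 2: 0.1875}
```

Swapping the starting state to $(G,B)$ (i.e. exchanging the roles of the two chains) gives the same dictionary, which is exactly the equivalence the hint exploits.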