Let $(A_n)_{n \ge 0}$ be a Markov$(\lambda,P)$ chain on a state space $I$, with stationary distribution $\pi$. Let $(B_n)_{n \ge 0}$ be a Markov$(\pi,P)$ chain on the same state space $I$, independent of $(A_n)$. Prove that $C_{n}=(A_{n},B_{n})$ is a Markov chain on $I \times I$.
Here's my attempt at a proof:
First note that any state $c_{n}$ of $C_{n}$ has the form $c_{n}=(a_{n},b_{n})$, where $a_n, b_{n} \in I$.
$$ P(C_{n+1}=c_{n+1} \mid C_{0}=c_{0},\dots,C_{n}=c_{n}) $$ Rewriting: \begin{align*} &P((A_{n+1},B_{n+1})=(a_{n+1}, b_{n+1})\mid (A_0, B_0)=(a_0,b_0),\dots,(A_{n},B_{n})=(a_n,b_n))\\ &=P(A_{n+1}=a_{n+1}, B_{n+1}=b_{n+1} \mid A_{0}=a_{0},\dots,A_{n}=a_n, B_{0}=b_{0},\dots,B_{n}=b_n)\\ &=P(A_{n+1}=a_{n+1} \mid A_{0}=a_{0},\dots,A_{n}=a_n) \cdot P(B_{n+1}=b_{n+1} \mid B_{0}=b_{0},\dots,B_{n}=b_n)\\ &=P(A_{n+1}=a_{n+1} \mid A_{n}=a_{n}) \cdot P(B_{n+1}=b_{n+1} \mid B_{n}=b_{n})\\ &=P(A_{n+1}=a_{n+1}, B_{n+1}=b_{n+1} \mid A_{n}=a_{n}, B_{n}=b_{n})\\ &=P((A_{n+1},B_{n+1})=(a_{n+1},b_{n+1}) \mid (A_{n}, B_{n})=(a_{n},b_{n}))\\ &=P(C_{n+1}=c_{n+1} \mid C_{n}=c_{n}) \end{align*} Hence $(C_n)$ is a Markov chain.
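As a numerical sanity check (separate from the proof): independence of the two chains means the product chain $C_n$ has transition matrix $P_C\bigl((i,k),(j,l)\bigr) = p_{ij}\,p_{kl}$, i.e. the Kronecker product $P \otimes P$, and $\pi \otimes \pi$ is stationary for it. A minimal numpy sketch, using a made-up 2-state matrix $P$ for illustration:

```python
import numpy as np

# A made-up 2-state transition matrix P (illustrative; any stochastic P works).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Stationary distribution pi of P: left eigenvector for eigenvalue 1,
# normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
pi = pi / pi.sum()

# Transition matrix of C_n = (A_n, B_n) on I x I: independence of the
# two chains gives P_C((i,k),(j,l)) = p_ij * p_kl, the Kronecker product.
P_C = np.kron(P, P)

# P_C is a valid stochastic matrix: every row sums to 1.
assert np.allclose(P_C.sum(axis=1), 1.0)

# pi (x) pi is stationary for the product chain.
pi_C = np.kron(pi, pi)
assert np.allclose(pi_C @ P_C, pi_C)
```

This doesn't prove the Markov property, but it confirms the transition structure the proof arrives at in the last two lines.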
I'm not sure whether the $\textbf{second}$ and $\textbf{fourth}$ equals signs are correct.
i.e., can I say $P(A \cap B \mid C \cap D)=P(A \mid C) \cdot P(B \mid D)$ by independence?
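For what it's worth, here is a sketch of why that factorisation holds, assuming the pair of events $\{A, C\}$ is independent of the pair $\{B, D\}$ (as is the case here, since the $A$-chain and the $B$-chain are independent processes) and $P(C \cap D) > 0$:

$$
P(A \cap B \mid C \cap D)
= \frac{P(A \cap C \cap B \cap D)}{P(C \cap D)}
= \frac{P(A \cap C)\,P(B \cap D)}{P(C)\,P(D)}
= P(A \mid C)\,P(B \mid D).
$$

The middle step uses independence twice: once to factor the four-way intersection, and once to factor $P(C \cap D)$.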