I'm trying to find the probability transition matrix in this Markov chain problem. Three black and three white balls are distributed between two urns, so that each urn contains three balls. At each step one ball is drawn from each urn and placed in the other one. Let $X_n$ be the number of white balls in one of the urns after $n$ steps.
Thanks in advance for your help
Let $X_n$ be the number of white balls in urn $1$. Then $X_n\in\{0,1,2,3\}$. The probability transition matrix is
$$ \begin{bmatrix} 0 & 1 & 0 & 0 \\ 1/9 & 4/9 & 4/9 & 0 \\ 0 & 4/9 & 4/9 & 1/9 \\ 0 & 0 & 1 & 0 \\ \end{bmatrix} $$
How is this derived? Take the entries in row $2$ (the state $X_{n-1}=1$), for example, and let WB denote the event "draw White from urn $1$ and Black from urn $2$", and so on. When $X_{n-1}=1$, urn $1$ holds $1$ white and $2$ black balls, and urn $2$ holds $2$ white and $1$ black. Then,
\begin{eqnarray*} P(X_n=0\mid X_{n-1}=1) &=& P(\text{WB}) = \dfrac{1}{3}\cdot\dfrac{1}{3} = \dfrac{1}{9} \\ && \\ P(X_n=1\mid X_{n-1}=1) &=& P(\text{WW} \cup \text{BB}) = \dfrac{1}{3}\cdot\dfrac{2}{3} + \dfrac{2}{3}\cdot\dfrac{1}{3} = \dfrac{4}{9} \\ && \\ P(X_n=2\mid X_{n-1}=1) &=& P(\text{BW}) = \dfrac{2}{3}\cdot\dfrac{2}{3} = \dfrac{4}{9} \\ && \\ P(X_n=3\mid X_{n-1}=1) &=& 0. \end{eqnarray*}
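The same row-by-row reasoning applies to every state, so as a sanity check (not part of the derivation above) the full matrix can be built programmatically with exact fractions; `transition_matrix` is just an illustrative helper name:

```python
from fractions import Fraction

def transition_matrix(n=3):
    # State i = number of white balls in urn 1; each urn always holds n balls.
    # If X = i, urn 1 has i white / (n-i) black, urn 2 has (n-i) white / i black.
    P = [[Fraction(0)] * (n + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        pW1 = Fraction(i, n)      # draw white from urn 1
        pB1 = 1 - pW1             # draw black from urn 1
        pW2 = Fraction(n - i, n)  # draw white from urn 2
        pB2 = 1 - pW2             # draw black from urn 2
        if i > 0:
            P[i][i - 1] = pW1 * pB2          # WB: white leaves urn 1, black enters
        P[i][i] = pW1 * pW2 + pB1 * pB2      # WW or BB: count unchanged
        if i < n:
            P[i][i + 1] = pB1 * pW2          # BW: black leaves urn 1, white enters
    return P

for row in transition_matrix():
    print([str(x) for x in row])
```

Each row sums to $1$, and row $2$ reproduces the $1/9,\ 4/9,\ 4/9,\ 0$ entries computed above.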