transition matrix of two players - Markov Chain


Suppose now that two players ($\textbf{A}$ and $\textbf{B}$), each having \$2, agree to keep playing the game and betting \$1 at a time until one player is broke. The probability of $\textbf{A}$ winning a single bet is 1/3, so $\textbf{B}$ wins the bet with probability 2/3. The number of dollars that player A has before each bet (0, 1, 2, 3, or 4) provides the states of a Markov chain. i.e., states are 0, 1, 2, 3, 4.
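As a sanity check on the setup, the game is easy to simulate. The following is a quick Monte Carlo sketch (Python; the function name `play_game` is my own); the classical gambler's-ruin formula $\frac{1-(q/p)^i}{1-(q/p)^N}$ with $p=1/3$, $i=2$, $N=4$ gives $P(\textbf{A}\text{ wins})=1/5$, and the estimate should land near that value.

```python
import random

def play_game(p_win, start, target, rng):
    """Simulate one game of repeated $1 bets; True if A reaches `target`."""
    bankroll = start
    while 0 < bankroll < target:
        bankroll += 1 if rng.random() < p_win else -1
    return bankroll == target

rng = random.Random(42)
n = 100_000
wins = sum(play_game(1/3, 2, 4, rng) for _ in range(n))
print(wins / n)  # estimate of P(A wins); the gambler's-ruin formula gives exactly 1/5
```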

The transition matrix is given as follows: $$\begin{bmatrix} 1 & 0 & 0 & 0 & 0 \\ 2/3 & 0 & 1/3 & 0 & 0 \\ 0 & 2/3 & 0 & 1/3 & 0 \\ 0 & 0 & 2/3 & 0 & 1/3 \\ 0 & 0 & 0 & 0 & 1 \end{bmatrix}$$
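A minimal check that this matrix is a valid (row-stochastic) transition matrix can be done in Python, using exact fractions so there is no floating-point fuzz:

```python
from fractions import Fraction as F

# Rows and columns are indexed by A's bankroll: 0, 1, 2, 3, 4.
P = [
    [F(1),    0,       0,       0,       0],
    [F(2, 3), 0,       F(1, 3), 0,       0],
    [0,       F(2, 3), 0,       F(1, 3), 0],
    [0,       0,       F(2, 3), 0,       F(1, 3)],
    [0,       0,       0,       0,       F(1)],
]
print(all(sum(row) == 1 for row in P))  # every row sums to 1
```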

Now my question is: how can I construct this transition matrix, and is it unique? Thanks in advance!

On BEST ANSWER

The entry in row $i$, column $j$ is $p_{ij}$, the probability that if the chain is in state $i$ at time $n$, it will be in state $j$ at time $n+1$. So if the chain is in state $0$, it remains in state $0$, because $\mathbf{A}$ is broke and the game is over. Hence $p_{00}=1$ and $p_{0i}=0$ for $i\neq 0$. That takes care of the first row. The last row is similar, because the game also ends if $\mathbf{B}$ is broke.

Each of the other rows expresses the fact that $\mathbf{A}$'s bankroll increases by $1$ with probability $\frac13$ or decreases by $1$ with probability $\frac23$, and there are no other possibilities. Since every entry is forced by these rules, the matrix is unique once you fix the ordering of the states.