I'm stuck on an exercise. I tried it, but my results don't seem to match the proposed ones.
Exercise: Two players play the following game. The one who begins draws two cards from a deck of 40 cards (10 cards per suit):
- if they are both clubs the player wins;
- if they are of the same suit but not clubs, the player shuffles the cards and starts again;
- otherwise the player shuffles the cards and lets the other player play.
1) Model the game with a Markov Chain.
2) What is the probability that the player who started wins?
My attempt: If $X_n$ is the player whose turn it is at step $n$, we have a three-state MC with state space $\{A,\, B,\, \text{exit}\}$, where the exit state is reached when the game ends. The transition matrix is
$$P=\begin{pmatrix}9/52 & 10/13 & 3/52 \\ 10/13 & 9/52 & 3/52 \\ 0 & 0 & 1 \end{pmatrix}$$
since the probability of staying (drawing two cards of the same non-club suit) is $3 \cdot \frac{10}{40} \cdot \frac{9}{39} = \frac{9}{52}$, and so on.
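These draw probabilities can be double-checked with exact rational arithmetic (a quick sketch using Python's standard `fractions` module):

```python
from fractions import Fraction

# Drawing 2 cards without replacement from a 40-card deck, 10 per suit.
both_clubs = Fraction(10, 40) * Fraction(9, 39)               # game ends, current player wins
same_suit_not_clubs = 3 * Fraction(10, 40) * Fraction(9, 39)  # same player draws again
different_suits = 1 - both_clubs - same_suit_not_clubs        # turn passes to the other player

print(both_clubs, same_suit_not_clubs, different_suits)  # 3/52 9/52 10/13
```

The three values match the entries of $P$ (note $10/13 = 40/52$).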
For the second question I thought I could use the strong Markov property (the one which states that a Markov chain starts afresh at a stopping time), applied to the last time I see A (or B) before the game ends. Starting from that point, the probability is just the jump from A (or B) to exit, times two (to account for both the A and the B case).
What's wrong with this last point?
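For reference, the game is quick to simulate, which gives a numerical target to check any proposed answer against (a minimal sketch; the function name `play_once` is my own):

```python
import random

def play_once(rng):
    """Simulate one game; return True if the starting player wins."""
    deck = [suit for suit in range(4) for _ in range(10)]  # suit 0 = clubs
    current_is_starter = True
    while True:
        a, b = rng.sample(deck, 2)   # draw two cards without replacement
        if a == 0 and b == 0:        # both clubs: current player wins
            return current_is_starter
        if a != b:                   # different suits: other player's turn
            current_is_starter = not current_is_starter
        # same non-club suit: same player shuffles and draws again

rng = random.Random(0)
n = 200_000
wins = sum(play_once(rng) for _ in range(n))
print(wins / n)  # Monte Carlo estimate of P(starter wins)
```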
First, I wouldn't model the Markov chain like that; I would consider four states: 1 = "player A plays", 2 = "player B plays", 3 = "player A has won" and 4 = "player B has won".
The transition matrix would be
$$M = \begin{pmatrix} 9/52 & 10/13 & 3/52 & 0 \\ 10/13 & 9/52 & 0 & 3/52 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}$$
Then the probability of being in each state after $n$ steps, given that we started in state 1, is given by
$$\pi^{(n)} = (1,0,0,0)M^n$$
And the probability we're looking for is
$$p = \lim_{n\to +\infty} \pi_3^{(n)}$$
To calculate $M^n$, you can diagonalize it, and that should give you the answer (I didn't do the calculation).
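Instead of diagonalizing by hand, one can also approximate the limit numerically by raising $M$ to a large power (the transient eigenvalues $49/52$ and $-31/52$ have modulus below 1, so $M^n$ converges quickly); a minimal NumPy sketch:

```python
import numpy as np

# Transition matrix from above: states (A plays, B plays, A won, B won).
M = np.array([
    [ 9/52, 40/52, 3/52,    0 ],
    [40/52,  9/52,    0, 3/52 ],
    [    0,     0,    1,    0 ],
    [    0,     0,    0,    1 ],
])

pi0 = np.array([1.0, 0.0, 0.0, 0.0])       # game starts with player A
pi = pi0 @ np.linalg.matrix_power(M, 500)  # M^n has essentially converged
print(pi[2])  # P(starter wins) ≈ 0.518, i.e. 43/83
```

The same value $p = 43/83$ follows exactly from first-step analysis: with $p = \Pr(\text{A wins} \mid \text{A plays})$ and, by symmetry, $\Pr(\text{A wins} \mid \text{B plays}) = 1 - p$, we get $p = \frac{3}{52} + \frac{9}{52}p + \frac{40}{52}(1-p)$, hence $83p = 43$.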