Solution using Markov chains


Players $A$ and $B$ take turns flipping a fair coin, with $A$ flipping first; the winner is whoever gets heads first.

This problem is easily solved with a geometric series, yielding $P(A)=\frac{2}{3}$. I wanted to solve it with Markov chains as well, but it is unclear to me how to account for the advantage of going first. Any help on this matter? (By Markov chains I do not mean the matrix version, but the plain-old approach of solving a small system of linear equations.)
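As a sanity check on the geometric-series value (a sketch I wrote, not part of the question): $A$ wins on round $k$ with probability $(\frac14)^k \cdot \frac12$, so the sum is $\frac{1/2}{1 - 1/4} = \frac23$. This can be verified exactly with Python's `fractions` module:

```python
from fractions import Fraction

p_head = Fraction(1, 2)
p_miss_round = (1 - p_head) ** 2          # both A and B flip tails: 1/4
p_a = p_head / (1 - p_miss_round)         # closed form of the geometric series

assert p_a == Fraction(2, 3)
print(p_a)  # 2/3
```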


Best answer

Just notice that if neither $A$ nor $B$ wins on the first round, the game starts anew: $$ \begin{align*} P[A\ \text{wins}] &= P[A\ \text{wins on first try}] + P[\text{$A$ and $B$ don't win on first try}] \cdot P[A\ \text{wins}] \\ &= \frac12 + \frac14 \cdot P[A\ \text{wins}]. \end{align*} $$ Solving this linear equation gives $\frac34 \, P[A\ \text{wins}] = \frac12$, hence $P[A\ \text{wins}] = \frac23$.