Suppose that $A$ tosses a coin which lands heads with probability $p_A$ and $B$ tosses a coin which lands heads with probability $p_B$. They toss their coins simultaneously over and over again, in a competition to see who gets the first head. The one to get the first head is the winner, except that a draw results if they get their first heads together.
Calculate P(A wins).
Let $X_A$ be the number of tosses it takes $A$ to get the first head ($X_A \ge 1$), and let $X_B$ be the number of tosses it takes $B$ to get the first head ($X_B \ge 1$).
$P(A \text{ wins})$
$= P(X_A \lt X_B)$
$=\sum_{i=1}^{\infty}P(X_A<X_B \mid X_A=i)P(X_A=i)$
$=\sum_{i=1}^{\infty}P(X_B>i)P(X_A=i)$ (the coins are independent, so conditioning on $X_A=i$ does not affect $X_B$)
$=\sum_{i=1}^{\infty}(1-p_B)^i(1-p_A)^{i-1}p_A$
$=\frac{p_A}{1-p_A}\sum_{i=1}^{\infty}(1-p_B)^i(1-p_A)^i$
$=\frac{p_A}{1-p_A}\sum_{i=1}^{\infty}((1-p_B)(1-p_A))^i$
$=\frac{p_A}{1-p_A}\left(\sum_{i=0}^{\infty}((1-p_B)(1-p_A))^i - ((1-p_B)(1-p_A))^0\right)$
$=\frac{p_A}{1-p_A}\left(\frac{1}{1-(1-p_A)(1-p_B)}-1\right)$
$=\frac{p_A}{1-p_A}\cdot\frac{(1-p_A)(1-p_B)}{1-(1-p_A)(1-p_B)}$
$=\frac{p_A(1-p_B)}{1-(1-p_A)(1-p_B)}$
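As a numerical sanity check on the algebra, we can compare a truncated version of the series with the closed form above. This is just a sketch; the values $p_A = 0.2$, $p_B = 0.5$ and the truncation depth are arbitrary illustrative choices.

```python
# Check the series for P(A wins) against the closed form numerically.
# p_a, p_b, and the truncation depth N are illustrative choices.
p_a, p_b = 0.2, 0.5
N = 200  # terms decay like ((1-p_a)(1-p_b))^i, so this is plenty

# Truncated series: sum_{i=1}^{N} (1-p_b)^i (1-p_a)^(i-1) p_a
series = sum((1 - p_b) ** i * (1 - p_a) ** (i - 1) * p_a
             for i in range(1, N + 1))

# Closed form from the derivation above
closed = p_a / (1 - p_a) * (1 / (1 - (1 - p_a) * (1 - p_b)) - 1)

print(series, closed)  # both ≈ 0.16666... = 1/6
```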
Textbook Answer:
$\frac{(1-p_A)p_B}{1-(1-p_A)(1-p_B)}$
They're not the same: with $p_A = 0.2$, $p_B = 0.5$ my expression evaluates to $1/6$, while the textbook's evaluates to $2/3$. Note that the textbook formula is exactly my result with $A$ and $B$ interchanged, i.e. it equals $P(B \text{ wins})$.
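To settle which expression matches the game as stated, here is a quick Monte Carlo sketch (the seed and trial count are arbitrary choices):

```python
import random

# Simulate the game: each round both players toss; the first head wins,
# and simultaneous first heads is a draw.  Illustrative parameters.
random.seed(0)
p_a, p_b = 0.2, 0.5
trials = 200_000

a_wins = 0
for _ in range(trials):
    while True:
        a_head = random.random() < p_a
        b_head = random.random() < p_b
        if a_head or b_head:
            if a_head and not b_head:
                a_wins += 1
            break  # otherwise B won or it's a draw; the game is over

print(a_wins / trials)  # ≈ 1/6 ≈ 0.167
```

With these parameters the simulated frequency lands near $1/6$, matching the derivation rather than the textbook expression evaluated for $P(A \text{ wins})$.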
Literally had to search every corner of the internet for this solution. I'll just leave it here.