I am trying to compute a very specific probability for a Yahtzee-style game.
Here are the rules:
- There are 5 fair, 5-sided dice with equal probability of rolling each number.
- You have 3 rolls to obtain a straight [1,2,3,4,5].
- Per standard Yahtzee rules, you may choose which dice to reroll in pursuit of the straight (e.g., if roll 1 gives [1,2,3,3,4], you can keep [1,2,3,4] and reroll the remaining die).
- Your first roll must contain exactly three of a kind (e.g., [1,2,3,3,3] or [1,1,3,3,3]).
- First rolls with four of a kind, five of a kind, or a ready-made straight **do not** count (e.g., [3,3,3,3,4]).
I am trying to write this in mathematical notation. My intuition tells me this is a conditional probability, where
$P(C)$ = probability that we get a 3 of a kind on roll 1
$P(A)$ = probability that we roll a straight in 3 rolls
$P(A \mid C)$ = probability that we roll a straight in 3 rolls, given we get a 3 of a kind on roll 1
$$P(A \mid C) = \frac{P(A \cap C)}{P(C)}$$
I am worried my intuition is incorrect. I ran a 20,000-trial simulation (which I have verified as correct) that says the probability of getting three of a kind on the first roll is roughly 20.06%, and the probability of rolling a straight, given three of a kind on the first roll, is roughly 4.34%.
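For reference, my simulation follows roughly this outline (a minimal sketch, not my exact code: the reroll strategy of keeping one die of each distinct value is an assumption, and names like `reroll_toward_straight` are illustrative):

```python
import random

def roll(n):
    """Roll n fair five-sided dice."""
    return [random.randint(1, 5) for _ in range(n)]

def exactly_three_of_a_kind(dice):
    """True if some value appears exactly three times
    (this rules out 4- and 5-of-a-kind and straights)."""
    return max(dice.count(v) for v in set(dice)) == 3

def reroll_toward_straight(dice):
    """Keep one die of each distinct value, reroll the duplicates.
    (This particular strategy is an assumption of the sketch.)"""
    kept = list(set(dice))
    return kept + roll(5 - len(kept))

def play():
    """One game. Returns (first_roll_qualified, got_straight)."""
    dice = roll(5)
    if not exactly_three_of_a_kind(dice):
        return False, False            # game does not count
    for _ in range(2):                 # up to two rerolls
        if sorted(dice) == [1, 2, 3, 4, 5]:
            break
        dice = reroll_toward_straight(dice)
    return True, sorted(dice) == [1, 2, 3, 4, 5]

random.seed(1)
N = 100_000
games = [play() for _ in range(N)]
p_c  = sum(c for c, _ in games) / N   # estimate of P(C)
p_ac = sum(s for _, s in games) / N   # estimate of P(A and C)
print(f"P(C)       ~ {p_c:.4f}")
print(f"P(A and C) ~ {p_ac:.4f}")
print(f"P(A | C)   ~ {p_ac / p_c:.4f}")
```

Note the sketch tracks $P(C)$ and the joint $P(A \cap C)$ separately; which of those two quantities the 4.34% figure actually estimates is exactly what I am unsure about.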
However, plugging these values into my conditional probability formula gives:
$$P(A \mid C) = \frac{0.0434}{0.2006} = 0.2163$$
I am confused: isn't $P(A \mid C)$ the probability of rolling a straight in 3 rolls, given a three of a kind on the first roll?
Any guidance would be very helpful. I wrote the simulation first, and now I need to verify my result (0.0434) analytically, by hand.
Thanks!