There is a game where you are asked to roll two fair six-sided dice. If the sum of the values equals 7, you win £21. However, you must pay £5 each time you roll both dice. Do you play this game?
One way to think about this is that getting a 7 has a 1/6 chance, and to break even we would need to get a 7 about 1/4 of the time (since £21 × 1/4 ≈ £5), so the answer is not to play.
Another way to think about it is: what is my chance of throwing a 7 at least once in every 4 throws? In that case I would calculate the probability of not throwing a 7 four times in a row, (5/6)^4, and then subtract this from 1 to get the probability of throwing at least one 7: 1 − (5/6)^4 ≈ 0.52. By this logic I would play the game.
These two answers cannot both be correct. Could someone explain to me which one is incorrect and why? Thanks!
EDIT: wow, this is the first time I've asked a question on StackOverflow - I did not expect to get so many responses. Thank you all, I am very grateful!
You are correct about the probabilities involved here. The probability of rolling a total of $7$ on a pair of fair dice is indeed $1/6$. Likewise, the probability of rolling a $7$ at least once in four rolls is $1-\left(\frac{5}6\right)^4$, which is about $52\%$ - and this might equally well be interpreted as "the probability of earning money after four repetitions of the game". We have to be careful about how we interpret such quantities, however.
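For anyone who wants to double-check these two numbers, here's a quick exact computation in Python (just a sanity check, not part of the original argument):

```python
from fractions import Fraction
from itertools import product

# Exact probability that two fair dice sum to 7: count the favourable
# outcomes among all 36 equally likely ordered pairs.
p7 = Fraction(sum(1 for a, b in product(range(1, 7), repeat=2) if a + b == 7), 36)
print(p7)  # 1/6

# Probability of at least one 7 in four independent rolls:
# the complement of missing all four times.
p_at_least_one = 1 - (1 - p7) ** 4
print(p_at_least_one, float(p_at_least_one))  # 671/1296, about 0.52
```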
Imagine the following variation of the game, to which the logic of your second answer applies:

> You pay $£20$ up front to roll the pair of dice four times. If you roll a total of $7$ at least once, you win $£21$; otherwise you win nothing.
Or, equivalently, by looking at the net gains for either outcome:

> With probability $1-\left(\frac{5}6\right)^4 \approx 52\%$ you come out $£1$ ahead; with probability $\left(\frac{5}6\right)^4 \approx 48\%$ you lose $£20$.
While it's true that you will win this game $52\%$ of the time, it's probably not a game you want to play - it's basically flipping a coin between "earn $£1$" and "lose $£20$". Your second argument posits a world where we'd play this game because we'll probably win - but it ignores that the consequence of losing far outweighs the benefit of winning. This game is more pessimistic than the original one (since it doesn't allow you to win multiple prizes), but more cleanly illustrates why your second reasoning is misleading*.
The typical way to evaluate this sort of game rigorously is by looking at expected values rather than probabilities. Expected value is the answer to the following question:

> If I were to play this game many times, how much would I gain or lose per game, on average?
This can be computed by multiplying the possible outcomes by their respective probabilities - in the case of this problem, you have a $1/6$ chance of earning a net of $£16$ and a $5/6$ chance of earning $-£5$ (i.e. losing $£5$). You can calculate the expected value as: $$\frac{1}6 \cdot £16 - \frac{5}6 \cdot £5 = -£1.50$$ This shows that you expect to lose $£1.50$ in an average round of this game - meaning it's not worth playing. We could also rewrite this as $$\frac{1}6\cdot £21 - 1\cdot £5 = -£1.50$$ to capture the equivalent idea that we earn $£21$ with probability $1/6$, but must always pay $£5$ to enter - and this expression aligns well with your intuition: the amount we expect to be rewarded with (after paying to play) is $£21$ times the probability that it occurs - and for this to outweigh the entrance fee, it needs to happen about a quarter of the time, as you correctly say.
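The same expected value can be checked both exactly and by simulation (a small Python sketch, with numbers taken from the game above):

```python
from fractions import Fraction
import random

# Exact expected value: win the £21 prize with probability 1/6,
# pay the £5 entry fee every round no matter what.
ev = Fraction(1, 6) * 21 - 5
print(ev)  # -3/2, i.e. an average loss of £1.50 per round

# A quick Monte Carlo check of the same quantity.
random.seed(0)
n = 100_000
net = sum(21 * (random.randint(1, 6) + random.randint(1, 6) == 7) - 5
          for _ in range(n))
print(net / n)  # hovers around -1.5
```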
You could also imagine this as playing the game six times - and considering that you expect to win once (earning $£21$) but pay $£30$ to enter, leaving a loss of $£9$ - which is just six times the previously calculated loss for a single game.
As a matter of curiosity not directly implicated in your question: these numbers are somewhat related. Your second calculation can be described as:

> the probability that you come out ahead after playing the game $4$ times.
This turns out to be greater than $50\%$, but you could imagine asking for a different probability such as:

> the probability that you come out ahead after playing the game $100$ times.
I won't go into details, but this probability comes out to only $3\%$ - and if you play $200$ rounds, the probability of coming out ahead drops to $0.4\%$, and all the way down to $0.07\%$ after $300$ rounds. As a general rule, if you play a game with negative expected value, the probability of coming out ahead decreases at an exponential rate with the number of games played. The fact that you're more likely than not to come out ahead after $4$ games is true but misleading - the situation only gets worse the more games you play, which is one of the things that the expected value tells you (and one reason why expected value is a very good tool for evaluating games of chance).
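For the curious, these tail probabilities are just binomial sums and can be computed exactly. Being in profit after $n$ rounds means winning $k$ times with $21k > 5n$, where $k$ is $\mathrm{Binomial}(n, 1/6)$. A short sketch (`p_ahead` is just a helper name I'm using):

```python
from fractions import Fraction
from math import comb

def p_ahead(n, p=Fraction(1, 6), prize=21, fee=5):
    """Exact probability of being in profit after n rounds: we need k wins
    with prize*k > fee*n, where k is Binomial(n, p)."""
    k_min = fee * n // prize + 1  # smallest k with 21k > 5n
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(k_min, n + 1))

for n in (4, 100, 200, 300):
    print(n, float(p_ahead(n)))
```

Running this shows the exponential decay described above: the probability starts above $50\%$ at $n = 4$ and shrinks rapidly as $n$ grows.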
*Note: I say "misleading" rather than "wrong" because it is a correct calculation of something - but it'd be a stretch to conclude from that probability that this would be a good game to play if you were just walking down the street. However, there are natural contexts where the second kind of calculation is useful - for instance, in the last round of many board games or card games, you might be able to directly calculate a probability of a certain move causing you to win the game via similar logic - where the expected value calculation of "what would happen if I did this a bunch and averaged it" isn't actually relevant (and might lead to different - and therefore incorrect - results). That said, expected value is usually the right thing to think about except when you have a compelling reason to consider something else.