Addition of probabilities and gambler's fallacy


Say you have a 1 in 6 chance of winning a card game. The more times you play, the higher the odds of you winning.

$$P(\text{win over 1 trial}) = 1/6 \\ P(\text{win over 2 trials}) = 1/6 + 1/6 \\ ... \\ P(\text{win over n trials}) = n/6$$

From this, you can deduce you'd have to play six times in order to have won once. However, this isn't the case. Even though these events happen in a series, the odds of each game always remain $1/6$; there is indeed a chance you won't win even after 6 trials, 10 trials, 100 trials, $\infty$ trials, etc.

Provided the odds of winning one game are $1/6$, how many games would you have to play to ensure you've won at least once? I think what I'm asking is: at which trial is the uncertainty about having won the greatest?

How does the addition of probabilities take into account true randomness?


BEST ANSWER

I think there is a problem with your calculation of probability. One way of thinking about your question is: "After playing $n$ times, what are the odds I have still not won anything?" Chaining independent events is done by multiplying the probabilities. So,

$$ \begin{array}{lll} P(\text{loss over 1 trial}) = 5/6&\implies& P(\text{some wins over 1 trial}) = 1 -5/6 = 1/6\\ P(\text{loss over 2 trials}) = (5/6)^2&\implies& P(\text{some wins over 2 trials}) = 1 -(5/6)^2 = 11/36\\ \end{array} $$

As you can guess, this generalizes to any $n$, so

$$ P(\text{some wins over $n$ trials}) = 1 -(5/6)^n $$

So, if you play forever, the probability that you win something tends to $\lim_{n\to\infty} 1 -(5/6)^n = 1$. Unfortunately, there is no way of guaranteeing a win for any finite number of trials. If you want a 50% chance of winning something, you can solve for $n$:

$$ 1 -(5/6)^n = \frac{1}{2} \implies n = \frac{\log 2}{\log 6 - \log 5} \approx 3.802 $$

So if you play 4 times, the odds are in your favor to win something. :)
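As a quick sanity check, you can evaluate $1-(5/6)^n$ directly and confirm the break-even point (a minimal Python sketch; the function name `p_at_least_one_win` is just for illustration):

```python
import math

# P(at least one win in n independent games), each with win probability 1/6
def p_at_least_one_win(n, p_win=1/6):
    return 1 - (1 - p_win) ** n

print(p_at_least_one_win(1))  # 1/6 ≈ 0.1667
print(p_at_least_one_win(2))  # 11/36 ≈ 0.3056

# Smallest n with at least a 50% chance of some win
n_half = math.log(2) / (math.log(6) - math.log(5))
print(n_half)  # ≈ 3.802, so 4 games tip the odds in your favor
```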

ANSWER

Let $A$ and $B$ be events in some sample space. It is true that $P(A\text{ or } B)=P(A)+P(B)$ if and only if $A$ and $B$ are disjoint: that is, as long as they can't both happen. So, if you roll a die and denote the outcome by $X$, $P(X=2)=\frac{1}{6}$ and $P(X\text{ is odd})=\frac{3}{6}$, and since the die can't come up both 2 and odd at the same time, we can say $$\begin{align}P(X=2\text{ or }X\text{ is odd})&=P(X=2)+P(X\text{ is odd})\\&=\frac{1}{6}+\frac{3}{6}\\&=\frac{4}{6}\end{align}$$

However, suppose we roll the die twice. Denote by $X_1$ the outcome of the first roll and $X_2$ the outcome of the second roll. Again it is true that $P(X_1=2)=\frac{1}{6}$, and that $P(X_2\text{ is odd })=\frac{3}{6}$. But if we are interested in $P(X_1=2 \text{ or } X_2\text{ is odd })$, we cannot simply add the probabilities like we did before, because the events are no longer disjoint. It is possible that the event $\{X_1=2 \text{ and } X_2\text{ is odd}\}$ happens.
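For overlapping events like these, the general rule is inclusion-exclusion: $P(A\text{ or }B)=P(A)+P(B)-P(A\text{ and }B)$. A small sketch (assuming the two rolls are independent) checks this against a brute-force count of all 36 outcomes:

```python
from fractions import Fraction

# Inclusion-exclusion for two independent die rolls:
# P(X1 = 2 or X2 is odd) = P(A) + P(B) - P(A and B)
p_a = Fraction(1, 6)   # P(X1 = 2)
p_b = Fraction(3, 6)   # P(X2 is odd)
p_both = p_a * p_b     # independence: P(A and B) = P(A) * P(B)
p_or = p_a + p_b - p_both
print(p_or)  # 7/12, not 1/6 + 3/6 = 2/3

# Brute-force check over all 36 equally likely outcome pairs
count = sum(1 for x1 in range(1, 7) for x2 in range(1, 7)
            if x1 == 2 or x2 % 2 == 1)
print(Fraction(count, 36))  # 7/12
```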

Similarly in your example, if it is possible to win in the first trial and to win in the second trial then you cannot add the probabilities like that.

> Provided the odds of winning one game are 1/6, how many games would you have to play to ensure you've won at least once?

There is no such number. If each game is independent then the probability of winning at least once in $n$ trials is equal to $1$ minus the probability of losing each time. Then,

$$\begin{align}P(\text{win at least once in }n\text{ trials}) &= 1-P(\underbrace{\text{lose, lose,}\dots\text{, lose}}_{n\text{ times}})\\ &= 1-\left(\underbrace{\frac{5}{6}\times\frac{5}{6}\times\dots\times\frac{5}{6}}_{n\text{ times}}\right) \\ &= 1-\left(\frac{5}{6}\right)^n \\&<1\end{align}$$
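A quick Monte Carlo sketch (assuming independent games with win probability $1/6$; function and variable names are illustrative) agrees with the closed form $1-(5/6)^n$:

```python
import random

random.seed(1)  # reproducible runs

# Estimate P(at least one win in n games) by simulation
def simulate(n_games, trials=100_000, p_win=1/6):
    hits = sum(
        any(random.random() < p_win for _ in range(n_games))
        for _ in range(trials)
    )
    return hits / trials

for n in (1, 6, 20):
    exact = 1 - (5/6) ** n
    print(n, round(simulate(n), 3), round(exact, 3))
```

The estimates creep toward 1 as $n$ grows but, like the exact formula, never reach it.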

ANSWER

The problem is that you are confusing the probability of an event with the expected number of times the event occurs.

After six games we expect that you'll win one (on average, if you played millions of groups of 6 games).

On the other hand, the probability of actually winning a specific game can never be 1, because we can never know the outcome of a game beforehand.