Probability and the Monty Hall problem


In this video it is explained that in the Monty Hall problem you have a $\frac {2}{3}$ probability of winning if you always switch and a $\frac {1}{3}$ probability of winning if you never switch.

I understand the reasoning, but it just feels wrong. If we can assume that the host always reveals a goat after my first pick, then only two options are left, and I can still choose either one by switching or not switching.

Doesn't that imply that there is always a $\frac {1}{2}$ probability of winning?


2 Answers

BEST ANSWER

Your calculation that choosing at random (with probability $\frac 12$ each) whether to switch or stay put gives a probability $\frac 12$ of winning the prize is perfectly correct.

Let $A$ be the event that your first choice is indeed the door that conceals the prize. Then, $P(A) = \frac 13$ and $P(A^c) = \frac 23$. Note that $A^c$ is the event that one of the two doors not chosen conceals the prize. Once one unchosen door has been opened, you flip a fair coin and switch or stay put according as the coin shows Heads or Tails. If $W$ is the event that you win the prize, then

\begin{align} W &= (A \cap T) \cup (A^c \cap H)\\ P(W) &= P(A)P(T) + P(A^c)P(H)\\ &= \frac 13\times \frac 12 + \frac 23\times \frac 12\\ &= \frac 12. \end{align}
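As a numerical sanity check of this coin-flip argument, here is a minimal Monte Carlo sketch; the `simulate` helper, its trial count, and the Heads-means-switch encoding are my own choices for illustration, not part of the answer. With a fair coin the observed win rate comes out near $\frac 12$.

```python
import random

def simulate(p_switch, trials=100_000):
    """Three-door Monty Hall: after the host opens a goat door you did not
    pick, switch with probability p_switch. Returns the observed win rate."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)    # door hiding the prize
        choice = random.randrange(3)   # contestant's first pick
        # The host opens a door that is neither the pick nor the prize.
        opened = random.choice([d for d in range(3) if d not in (choice, prize)])
        # Heads (probability p_switch) means switch to the remaining closed door.
        if random.random() < p_switch:
            choice = next(d for d in range(3) if d not in (choice, opened))
        wins += (choice == prize)
    return wins / trials

print(simulate(0.5))   # fair coin: comes out near 0.5
```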

The "always switch" strategy can be put into this framework by using a double-headed coin that always turns up Heads, leading to

\begin{align} P(W) &= P(A)P(T) + P(A^c)P(H)\\ &= \frac 13\times 0 + \frac 23\times 1\\ &= \frac 23 \end{align}

and similarly, for the "always stay put" strategy, the chances of winning are $\frac 13$. I leave it to you as an exercise to figure out the probability of winning if you instead toss a biased coin that turns up Heads with probability $p$, $0 < p < 1$, and to show that this probability is smaller than the probability of winning with the $p = 1$ "always switch" strategy.
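If you want to experiment with the biased-coin exercise numerically before working it out, the `simulate` sketch from above can be run at a few values of $p$; this is purely illustrative code of mine, not part of the answer.

```python
# Reusing the simulate() sketch defined above (illustrative only).
for p in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"p = {p:.2f}  win rate ~ {simulate(p):.3f}")
# p = 0 recovers "always stay put" (~1/3); p = 1 recovers "always switch" (~2/3).
```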

ANSWER

Rework the problem with 100 doors and 99 goats. You pick a door; the host then opens 98 of the doors you didn't pick, all showing goats. So of the original 100 doors only two are left: the door you picked and a door the host went through a bit of effort not to open.

So how does this feel? To me, I know that when I pick one door at random out of 100, I'm going to get a bad door; I'm not that lucky. So I know the host is going to open all the other goat doors and leave the prize door closed. So should I switch? Of course: I only had a 1 in 100 chance of picking the correct door in the first place, so there is a 99 in 100 chance that the prize was behind some other door. Whichever door that was is the door the host kept closed. So there is a 99 in 100 chance that the prize is behind the other remaining door.

The trick is to realize this isn't the probability of the prize being behind one specific other door (no particular door was specified in advance). It's the probability that the prize wasn't behind your door.
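To make the 100-door intuition concrete, here is a hedged simulation sketch; the function name, door count, and trial count are arbitrary choices of mine. Always switching wins roughly 99% of the time, always staying roughly 1%.

```python
import random

def simulate_100_doors(switch, doors=100, trials=100_000):
    """Monty Hall with `doors` doors: the host opens every door except the
    contestant's pick and one other door (the prize door when the pick is
    wrong, otherwise a random goat door). Returns the observed win rate."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(doors)
        choice = random.randrange(doors)
        if choice == prize:
            # The host keeps a random goat door closed alongside the pick.
            other = random.choice([d for d in range(doors) if d != choice])
        else:
            # The host must keep the prize door closed.
            other = prize
        if switch:
            choice = other
        wins += (choice == prize)
    return wins / trials

print(simulate_100_doors(switch=True))    # roughly 0.99
print(simulate_100_doors(switch=False))   # roughly 0.01
```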