Say "red" with playing cards


Assume you have a deck of playing cards, with 26 black and 26 red. The cards are revealed one at a time, and before each card is revealed you may say "red" (you only get one chance). If that card is red you win; otherwise you lose. What strategy maximises the winning chance?

Some simple strategies (I don't know if any of them works):

  • say "red" on the first card $(50\%)$
  • say "red" when there are more red cards left than black (might never happen)
  • say "red" after 25 red cards are gone
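Each of these can be checked empirically. Here is a quick Monte Carlo sketch (the function and strategy names are my own, not from the question); it suggests every one of the three strategies wins about half the time:

```python
import random

def win_rate(strategy, trials=40_000, reds=26, blacks=26):
    """Estimate the win probability of `strategy`, a function that, given
    (reds_left, blacks_left) before a card is revealed, returns True to
    say "red" now. If it never speaks up, the call is forced on the
    last card."""
    wins = 0
    for _ in range(trials):
        deck = ["R"] * reds + ["B"] * blacks
        random.shuffle(deck)
        r, b = reds, blacks
        for card in deck:
            if strategy(r, b) or r + b == 1:
                wins += (card == "R")
                break
            if card == "R":
                r -= 1
            else:
                b -= 1
    return wins / trials

first_card = lambda r, b: True    # say "red" immediately
more_reds  = lambda r, b: r > b   # say "red" once reds outnumber blacks
after_25   = lambda r, b: r == 1  # say "red" after 25 reds are gone

for s in (first_card, more_reds, after_25):
    print(round(win_rate(s), 2))  # each prints roughly 0.5
```

As the answers below prove, the ~0.5 for every strategy is no accident.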

There are 3 answers below.

Best answer

It doesn't matter what you do. There may be a simpler way to see this, but the straightforward way is by induction on the size of the deck.

I claim that your win probability with $n$ red and $m$ black cards is always $p_{n,m}=\frac{n}{n+m}$. This is certainly true when $n+m=1$, where you are obligated to say "red" immediately.

For larger deck sizes, you have two possible strategies. If you say "red" immediately, you will win with probability $\frac{n}{n+m}$. Alternately, you could wait one round, and then play from the resulting position. In this case you will win with probability $$ \frac{n}{n+m}p_{n-1,m}+\frac{m}{n+m}p_{n,m-1} $$

But by the induction hypothesis, this is equal to $$ \frac{n}{n+m}\frac{n-1}{n+m-1}+\frac{m}{n+m}\frac{n}{n+m-1}=\frac{n(n-1+m)}{(n+m)(n+m-1)}=\frac{n}{n+m} $$

So in fact, in either case you will win with probability $\frac{n}{n+m}$.
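The induction step can also be checked exhaustively with exact arithmetic. A small sketch (my own code, using Python's `fractions` to avoid floating-point error):

```python
from fractions import Fraction

def p(n, m):
    """Claimed win probability with n red and m black cards remaining."""
    return Fraction(n, n + m)

# Verify the induction step: waiting one round and then playing on gives
# exactly the same win probability as saying "red" immediately,
# for every deck composition up to 26 red + 26 black.
for n in range(1, 27):
    for m in range(1, 27):
        wait = Fraction(n, n + m) * p(n - 1, m) + Fraction(m, n + m) * p(n, m - 1)
        assert wait == p(n, m)
print("induction step verified")
```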

Answer 2

What we have to do is define hitting times.

A hitting time is a pair $(i,j)$ at which we should yell "red", i.e. when there are $i$ red and $j$ black cards remaining.

To determine the hitting times efficiently, we compute $f(i,j)$ recursively, where $f(i,j)$ is the probability that we win when there are $i$ red and $j$ black cards remaining, assuming we follow the optimal strategy from then on.

Then we can calculate $f(i,j)$ as $\max\left( \frac{i}{i+j},\; \frac{i}{i+j}f(i-1,j) + \frac{j}{i+j}f(i,j-1)\right)$.

The pair $(i,j)$ is a hitting time precisely when the term on the left (calling immediately) is strictly larger.

One clearly has the values $f(0,j)=0$ and $f(i,0)=1$ for positive $i$ and $j$.

So we just have to run that recursion on the $27^2$ values and see what comes up (I'm doing this now).
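Running that recursion is only a few lines. A sketch with exact rational arithmetic (function names are mine):

```python
from fractions import Fraction
from functools import lru_cache

@lru_cache(maxsize=None)
def f(i, j):
    """Optimal win probability with i red and j black cards remaining."""
    if i == 0:
        return Fraction(0)
    if j == 0:
        return Fraction(1)
    return max(call_now(i, j), wait(i, j))

def call_now(i, j):
    # say "red" immediately
    return Fraction(i, i + j)

def wait(i, j):
    # reveal one card, then continue playing optimally
    return call_now(i, j) * f(i - 1, j) + Fraction(j, i + j) * f(i, j - 1)

# (i, j) is a hitting time exactly when calling now is strictly better.
hitting = [(i, j) for i in range(1, 27) for j in range(1, 27)
           if call_now(i, j) > wait(i, j)]

print(f(26, 26))   # -> 1/2
print(hitting)     # -> []  (it never strictly pays to call early)
```

The empty list of hitting times is exactly the claim proved by induction below: calling early is never strictly better, so every strategy ties.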


It's pretty easy to prove that $f(i,j)=\frac{i}{i+j}$ by induction on $i+j$.

This is simply because $\frac{i}{i+j}(\frac{i-1}{i+j-1}) + \frac{j}{i+j}(\frac{i}{i+j-1})= \frac{i^2 -i}{(i+j)(i+j-1)}+\frac{ij}{(i+j)(i+j-1)}=\frac{i(i+j-1)}{(i+j)(i+j-1)}=\frac{i}{i+j}$


So, surprisingly, it doesn't matter what you do; equivalently, the probability that the last card comes out red is also $\frac{i}{i+j}$!

Answer 3

I don't find it surprising that 50/50 is the best (and worst) you can do. One way to start to see this is to just think to yourself: would I be able to make money playing this game if I got a dollar every time I won and lost a dollar when I didn't win? I think your gut reaction would be that you couldn't, primarily due to the symmetry of the situation.

We can push this further. Imagine two players $R$ and $B$. Let's say that $B$ plays the mirror image strategy of whatever $R$ does. That is, if $R$'s strategy is "wait until you see five black cards, then say 'Red'", $B$'s strategy is "wait until you see five red cards, then say 'Black'". Which should you bet on? It should be clear that if some strategy is good for $R$, then the mirror strategy is just as good for $B$. Thus, no matter the strategy, the probability that $R$ wins must equal the probability that $B$ wins.

The previous paragraph just puts the problem into a form where the Principle of Indifference can (more obviously) be applied. As the Principle of Indifference is often misused, here's a description of what is going on much more formally. Let $P(\varphi\mid I)$ be the probability that the proposition $\varphi$ holds given $I$ is true. In this case, $I$ represents the background information we have, e.g. the rules of the game, the setup of the problem, and any other relevant information. Let $\psi[\text{Red}\leftrightarrow\text{Black}]$ be the formula $\psi$ where all occurrences of "Red" are replaced with "Black" and vice versa, similarly for terms. We trivially have $$P(\varphi[\text{Red}\leftrightarrow\text{Black}]\mid I[\text{Red}\leftrightarrow\text{Black}])=P(\varphi\mid I)$$ which is just the fact that probabilities don't change just because we used different words for things. To apply the Principle of Indifference, we need to show that $I$ is logically equivalent to $I[\text{Red}\leftrightarrow\text{Black}]$. If that is true, then $$P(\varphi\mid I)=P(\varphi\mid I[\text{Red}\leftrightarrow\text{Black}])$$ from which it follows that $$P(\varphi[\text{Red}\leftrightarrow\text{Black}]\mid I)=P(\varphi\mid I)$$ In this case, if $\varphi$ is the proposition "the chosen strategy $S$ correctly says 'Red'", then $\varphi[\text{Red}\leftrightarrow\text{Black}]$ is the proposition "the chosen strategy $S[\text{Red}\leftrightarrow\text{Black}]$ correctly says 'Black'". In this case, $S[\text{Red}\leftrightarrow\text{Black}]$ is exactly what I meant by "the mirror strategy".

We need to show $I\equiv I[\text{Red}\leftrightarrow\text{Black}]$. The main fact that is relevant is $$(26\text{ cards are Red}\land 26\text{ cards are Black})$$ If we swap "Red" and "Black" we get $$(26\text{ cards are Black}\land 26\text{ cards are Red})$$ which is clearly logically equivalent. Of course, if the statement had been $$(40\text{ cards are Red}\land 12\text{ cards are Black})$$ swapping "Red" and "Black" would not produce a logically equivalent statement and thus we wouldn't be able to apply the Principle of Indifference.