Coin flipping and gambling game


Consider a coin game for two players. At the beginning, both players have $n$ credits. On every round, the player whose turn it is chooses how much to gamble: either 1 or 2 credits (of course, a bet of 2 is only allowed when both players have at least 2 credits). Then a coin is flipped: if it lands heads, player A wins the round; if tails, player B wins. The winner takes the gambled credits from the loser, and the turn passes to the other player. The game ends when one player holds all of the credits. Consider the following game as an example:

(A and B each start with 4 credits)

Round 1, A's turn:

A chooses to gamble 2 credits. It lands tails, so B wins. A has 2 credits and B has 6.

Round 2, B's turn:

B chooses to gamble 1 credit. It lands heads, so A wins. A has 3 credits and B has 5.

Round 3, A's turn:

A chooses to gamble 2 credits. It lands tails, so B wins. A has 1 credit and B has 7.

Round 4, B's turn:

B is forced to gamble 1 credit. It lands tails, so B wins. A has 0 credits and B has 8.

The game ends, and B is the winner.

Now, let's consider the players' strategies. A: chooses (when possible) uniformly at random between gambling 1 and 2. B: gambles 2 (when possible) if B has more credits than A, and only 1 otherwise (that is to say, if B has $x$ credits, then $x \leq n \Rightarrow$ bet $1$; $x > n \Rightarrow$ bet $2$).

Does B have more chances of winning than A?
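For anyone who wants to experiment, the setup can be sketched as a quick Monte Carlo simulation. This is only a sketch: the function name `play_game` and the assumption that A always moves first are mine (the question leaves the first mover unspecified).

```python
import random

def play_game(n, rng):
    """Play one game; return 'A' or 'B'. a tracks A's credits; B holds 2n - a."""
    a, turn = n, 'A'
    while 0 < a < 2 * n:
        own = a if turn == 'A' else 2 * n - a
        opp = 2 * n - own
        can_bet_two = min(own, opp) >= 2   # a 2-bet needs both players to hold >= 2
        if turn == 'A':                    # A: uniform random between 1 and 2
            bet = rng.choice([1, 2]) if can_bet_two else 1
        else:                              # B: 2 when strictly ahead, else 1
            bet = 2 if (own > opp and can_bet_two) else 1
        # heads -> A takes the stake from B, tails -> B takes it from A
        a += bet if rng.random() < 0.5 else -bet
        turn = 'B' if turn == 'A' else 'A'
    return 'A' if a == 2 * n else 'B'

rng = random.Random(0)
games = 100_000
wins_a = sum(play_game(4, rng) == 'A' for _ in range(games))
print(wins_a / games)  # comes out near 0.5
```

Swapping in other strategies (for example, B always betting 1) is a one-line change.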


There are 3 answers below.

Best answer:

The strategies have no bearing on the outcome of the game. You can equivalently view this game as a fair random walk, starting at $N$, ending at either $0$ or $2N$, where at each step the current player selects whether the next move will take the walk $2$ or only $1$ step backward or forward (the choice alternates between you and your opponent).

Denote $P(X)$ as the probability we hit $2N$ before hitting $0$ starting from $X$.

Now, for any pair of strategies, consider the last step at which either player selects $2$ as the jump size; after that, $1$ is used for as long as the game goes on. (Such a point must exist, since the game ends after finitely many rounds with probability $1$.) Suppose the walk is at position $X$ at that moment.

At this point, if the player selects $2$, the probability of reaching $2N$ is $0.5\,(P(X-2)+P(X+2))$.

If the player selects $1$ instead, the probability of reaching $2N$ is $0.5\,(P(X-1)+P(X+1))$.

Since the game from this point on is an unbiased unit-step random walk, we can use the standard (gambler's ruin) fact that the probability that such a walk at $X$ reaches $2N$ before $0$ is simply $X/2N$. Substituting this into the two expressions above shows that both options are equivalent.
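Writing the substitution out term by term (a worked check using $P(Y)=Y/2N$):

```latex
\frac{1}{2}\bigl(P(X-2)+P(X+2)\bigr)
  = \frac{1}{2}\left(\frac{X-2}{2N}+\frac{X+2}{2N}\right)
  = \frac{X}{2N}
  = \frac{1}{2}\left(\frac{X-1}{2N}+\frac{X+1}{2N}\right)
  = \frac{1}{2}\bigl(P(X-1)+P(X+1)\bigr)
```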

The proof is completed by running this argument "backwards", step by step, to show that at no point does the selection of $2$ versus $1$ matter.

The crux of this is that the probability of reaching $2N$ is linear in $X$, i.e., $f(A+B)=f(A)+f(B)$. If it were not, the player currently winning or losing might prefer $2$ over $1$.

And, since we showed this game is equivalent to a fair random walk starting at the midpoint, it follows that it is a fair game: each player wins with probability $1/2$.
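This conclusion can also be checked exactly (not just by sampling) with a small value-iteration solve of the underlying Markov chain. The function names, the start-with-A assumption, and the extra `always_one` strategy below are mine, not from the answer:

```python
def win_prob_A(n, strat_A, strat_B, iters=5000):
    """Solve for P(A reaches 2n credits), starting from (n credits, A's turn).

    A state is (a, turn) where a is A's credit count. A strategy maps
    (own credits, opponent credits) to a dict {bet size: probability}.
    """
    states = [(a, t) for a in range(1, 2 * n) for t in 'AB']
    P = {s: 0.5 for s in states}          # initial guess; iterated to a fixed point
    for _ in range(iters):
        new = {}
        for a, t in states:
            own = a if t == 'A' else 2 * n - a
            dist = (strat_A if t == 'A' else strat_B)(own, 2 * n - own)
            nxt = 'B' if t == 'A' else 'A'
            def val(x):
                return 1.0 if x >= 2 * n else 0.0 if x <= 0 else P[(x, nxt)]
            # average over the bet distribution, then over the fair coin
            new[a, t] = sum(p * 0.5 * (val(a + b) + val(a - b))
                            for b, p in dist.items())
        P = new
    return P[(n, 'A')]

def A_random(own, opp):
    # gamble 1 or 2 uniformly when a 2-bet is legal, else 1
    return {1: 0.5, 2: 0.5} if min(own, opp) >= 2 else {1: 1.0}

def B_threshold(own, opp):
    # bet 2 when strictly ahead (and legal), else 1
    return {2: 1.0} if (own > opp and min(own, opp) >= 2) else {1: 1.0}

def always_one(own, opp):
    return {1: 1.0}

for sB in (B_threshold, always_one):
    print(win_prob_A(4, A_random, sB))  # both print 0.5 (up to numerical error)
```

Any strategy pair plugged in gives the same $1/2$, which is exactly the strategy-independence this answer argues for.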

Second answer:

I don't think that B has a better chance than A. The reason is that the expected value of B's play is exactly the same in both scenarios: $$ X_{B_1} = \begin{cases} +1 & \text{with }\mathbb{P}=1/2 \\ -1 & \text{with }\mathbb{P}=1/2 \end{cases} \implies \mathbb{E}[X_{B_1}] = 0 $$ and also $$ X_{B_2} = \begin{cases} +2 & \text{with }\mathbb{P}=1/2 \\ -2 & \text{with }\mathbb{P}=1/2 \end{cases} \implies \mathbb{E}[X_{B_2}] = 0 $$

while A's strategy gives $$ X_{A} = \begin{cases} +1.5 & \text{with }\mathbb{P}=1/2 \\ -1.5 & \text{with }\mathbb{P}=1/2 \end{cases} \implies \mathbb{E}[X_{A}] = 0 $$ (that's because A bets 1 or 2 credits with equal probability, which is 1.5 on average)

What does change is the variance of the strategies: $$ \operatorname{Var}(X_{B_2}) = \mathbb{E}[X_{B_2}^2] - \mathbb{E}[X_{B_2}]^2 \\ = \mathbb{E}[X_{B_2}^2] = (+2)^2\cdot\frac{1}{2} + (-2)^2\cdot \frac{1}{2} = 4 $$ Similarly, you can show that $\operatorname{Var}(X_{B_1}) = 1$ and, under the $\pm 1.5$ simplification, $\operatorname{Var}(X_A) = 2.25$ (modeling A's play as its four actual equally likely outcomes $\pm 1, \pm 2$ gives $2.5$).

In the end I think that the strategy adopted by B only affects the duration of the game and the variance, not the expected value, which remains $0$ for both players.
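A quick numeric check of these means and variances. The distribution dictionaries and the `moments` helper are mine; the last line evaluates A's play as its four actual equally likely outcomes $\pm 1, \pm 2$ rather than the $\pm 1.5$ simplification:

```python
def moments(dist):
    """Mean and variance of a finite distribution given as {outcome: probability}."""
    mean = sum(x * p for x, p in dist.items())
    var = sum((x - mean) ** 2 * p for x, p in dist.items())
    return mean, var

b1 = {+1: 0.5, -1: 0.5}
b2 = {+2: 0.5, -2: 0.5}
a_simplified = {+1.5: 0.5, -1.5: 0.5}                 # the +-1.5 stand-in for A
a_actual = {+1: 0.25, -1: 0.25, +2: 0.25, -2: 0.25}   # bet 1 or 2, then win or lose

print(moments(b1))            # (0.0, 1.0)
print(moments(b2))            # (0.0, 4.0)
print(moments(a_simplified))  # (0.0, 2.25)
print(moments(a_actual))      # (0.0, 2.5)
```

Either model of A's play has mean $0$; only the variances differ.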

Third answer:

I think that the strategies have a significant bearing on the outcome of the game, with the result that B has a lower chance of winning than A.

Here's how I see it:

I'm assuming that both players are intelligent, competent, and trying to win.

With only the given rules and no strategies defined

Here, the outcome would be that neither player ever bets anything except $1$.
The reason for this is simple: under this reading of the rules, when a player bets something, they either lose that amount (and worse, it goes to the other player) or they merely get it back (and that's it!).
So it's only natural that they would bet the minimum amount.
Further, in such a scenario, assuming player A always goes first, A is at a slight disadvantage and player B has the greater chance of winning.
This can easily be seen in the case where both players start with only $1$ coin (that is, $n=1$).
The probability that player A loses, i.e. that B wins, is: $$=(0.5)+(0.5)(0.5)^2+(0.5)(0.5)^4+\dots \\ =\frac{0.5}{1-(0.5)^2} \ \text{ (using the sum of an infinite GP)} \\\Rightarrow P(\text{B wins})=\frac23$$

Consequently, $$P(\text{A wins})=1-\frac23=\frac13 \ (=(0.5)(0.5)+(0.5)(0.5)(0.5)^2+(0.5)(0.5)(0.5)^4+\dots)$$
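The two geometric series can be checked numerically (under this answer's reading of the rules; the truncation at 60 terms is mine):

```python
# B wins: 0.5 * sum_k 0.25^k ; A wins: 0.25 * sum_k 0.25^k
p_b = sum(0.5 * 0.25 ** k for k in range(60))   # partial sum of B's series
p_a = sum(0.25 * 0.25 ** k for k in range(60))  # partial sum of A's series
print(p_b, p_a)  # 2/3 and 1/3 (to floating-point precision)
```

The two probabilities sum to $1$, as they should.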

Now let's look at the defined strategies:

Here, let's see mathematically why betting only $1$ coin is better than betting $2$: $$E(X=1)=(0.5)(+1)+(0.5)(0)-1=-0.5$$ $$E(X=2)=(0.5)(+2)+(0.5)(0)-2=-1$$ so betting only $1$ coin is better (because then we lose less).

But, given that player A randomly chooses between gambling 1 or 2 coins, her/his expected gain/loss is: $$E(A)=0.5(-0.5)+0.5(-1)=-0.75$$ And similarly, for player B: $$E(B)=1(-1)=-1.$$

Now, just from that, it's obvious enough that player B is at a disadvantage (one that will likely overshadow the slight advantage of not betting first, intuitively).

Now that we know the expected gain/loss for each player, and assuming that player A goes first, here's how the game proceeds:

For $1$ coin: [figure omitted: game tree for $1$ coin]

For $2$ coins: [figure omitted: game tree for $2$ coins]

For $3$ coins: [figure omitted: game tree for $3$ coins]

So, basically, the outcome is the same as the outcome for $1$ coin, scaled by the number of coins we have at the beginning.


Final Result (also Tl;dr):

A has more chances of winning than B.
For $n$ coins, the expected length of the game (resulting in A's win) is $8n$ turns.