Probability question: optimal strategy


I am really confused about how to think about this question. It was presented as a challenge by a peer.

Two people seek to kill a duck at a location $Y$ meters from their origin. They walk from $x=0$ to $x=Y$ together. At any time, one of the two may pull out their gun and shoot at the duck, however, the probability that person A hits is $P_{A}(x)$ and the probability that person B hits is $P_{B}(x)$. It is also known that $P_A(0)=P_B(0)=0$ and $P_A(Y)=P_B(Y)=1$ and both functions are increasing functions.

What is the optimal strategy for each player?


4 Answers

Best answer

I believe both should shoot at the point where $P_A(x)+P_B(x)=1$. At that point, shooting wins with probability $P_A(x)$, while letting the other shoot wins with probability $1-P_B(x)=P_A(x)$, so the two options are equally good. If either shoots earlier, their chance of winning is reduced; if either plans to shoot later, the other can shoot just before that plan and do better. But what happens if they both hit or both miss?
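As a numerical sanity check (my own sketch, not part of the original answer), take illustrative hit curves $p_A(x) = x^2$ and $p_B(x) = x$ with $Y = 1$; since both are increasing, the point where they sum to 1 can be found by bisection, and at that point shooting and waiting give $A$ the same win probability:

```python
# Sketch with assumed hit curves p_A(x) = x^2, p_B(x) = x on [0, 1].
def p_A(x):
    return x * x

def p_B(x):
    return x

def crossing(lo=0.0, hi=1.0, tol=1e-12):
    """Bisect for the root of p_A(x) + p_B(x) = 1 (the sum is increasing)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if p_A(mid) + p_B(mid) < 1:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

x_star = crossing()
shoot_now = p_A(x_star)      # A shoots at x* and hits
wait      = 1 - p_B(x_star)  # B shoots at x* and misses; A cleans up at Y
print(x_star, shoot_now, wait)  # x* ≈ 0.618, both options ≈ 0.382
```

With these curves $x^*$ is the root of $x^2 + x = 1$, the golden-ratio conjugate, and both options give $A$ a win probability of about $0.382$.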

Answer

Suppose player $A$ takes a shot at distance $x$, before player $B$. He collects the prize with probability $P_1 = p_A(x)$; if he misses, player $B$ can simply wait until $x=Y$ and hit with certainty, so player $B$ collects the prize with probability $P_2 = 1-p_A(x)$.

If player $B$ shoots first, then $A$ wins with probability $Q_1 = 1-p_B(x)$ and $B$ wins with probability $Q_2 = p_B(x)$.

The optimal strategy for $A$ is to shoot at the point minimizing $B$'s chance of winning, i.e. $x_A = \operatorname{argmin}_x \max(p_B(x), 1-p_A(x))$, while the optimal strategy for $B$ is to shoot at $x_B = \operatorname{argmin}_x \max(p_A(x), 1-p_B(x))$.


Here is a visualization, assuming the duck is located at $Y=1$ and that $p_A(x)$ and $p_B(x)$ are cumulative distribution functions of beta distributions:

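The two argmin points can also be found numerically. Here is a small grid-search sketch, with $p_A(x)=x^2$ and $p_B(x)=x^3$ standing in for the beta CDFs (my assumption, chosen only for illustration):

```python
# Grid search for the minimax points of this answer, using assumed
# hit curves p_A(x) = x**2 and p_B(x) = x**3 on [0, 1].
def p_A(x): return x ** 2
def p_B(x): return x ** 3

N = 100_000
grid = [i / N for i in range(N + 1)]

# x_A minimizes B's winning probability max(p_B(x), 1 - p_A(x));
# x_B minimizes A's winning probability max(p_A(x), 1 - p_B(x)).
x_A = min(grid, key=lambda x: max(p_B(x), 1 - p_A(x)))
x_B = min(grid, key=lambda x: max(p_A(x), 1 - p_B(x)))
print(x_A, x_B)  # both land at the root of p_A(x) + p_B(x) = 1
```

Note that for continuous increasing curves the two minimizers coincide: $\max(p_B, 1-p_A)$ is smallest where $p_B = 1-p_A$, i.e. exactly where $p_A + p_B = 1$, and symmetrically for $x_B$ — consistent with the other answers.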

Answer

Suppose the probability functions $P_A, P_B$ are continuous, and that "increasing" means "non-decreasing". Then there is a unique maximal closed interval in which $P_A(x) + P_B(x) = 1$. Each player's strategy is identical: shoot at any time in this interval. (Ross Millikan simul-posted this answer.)
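To see how a whole interval (rather than a single point) can appear, here is a sketch with assumed piecewise-linear curves (my own construction, not from the answer): both are non-decreasing, satisfy the boundary conditions, and have matching plateaus on $[0.4, 0.6]$, where their sum is exactly 1.

```python
# Assumed non-decreasing piecewise-linear hit curves on [0, 1] whose
# sum equals 1 on the whole plateau [0.4, 0.6].
def p_A(x):                          # rises to 0.4, flat, then rises to 1
    if x < 0.4: return x
    if x < 0.6: return 0.4
    return 0.4 + 1.5 * (x - 0.6)

def p_B(x):                          # rises to 0.6, flat, then rises to 1
    if x < 0.4: return 1.5 * x
    if x < 0.6: return 0.6
    return 0.6 + (x - 0.6)

grid = [i / 10000 for i in range(10001)]
on = [x for x in grid if abs(p_A(x) + p_B(x) - 1) < 1e-9]
print(min(on), max(on))  # the maximal interval, here [0.4, 0.6]
```

With strictly increasing continuous curves the interval collapses to a single point, recovering the $P_A(x)+P_B(x)=1$ answer above.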

It gets a bit more complicated if $P_A$ or $P_B$ is not continuous. This is a realistic scenario: for instance, the brow of a hill might obscure the duck up to a certain point (which might be different for each player). Then there might be a point $x$ before which $P_A + P_B < 1 - a$, and after which $P_A + P_B > 1 + b$, for some strictly positive $a,b$. There are two cases:

  1. $P_A$ is continuous at $x$, and $P_B$ is not. Then player $A$ must shoot before $x$, as close to $x$ as possible. Likewise with $A$ and $B$ swapped.

  2. Neither $P_A$ nor $P_B$ is continuous at $x$. Then neither player wants to shoot before $x$, and neither player wants to allow the other to shoot after $x$. The situation becomes tense, and mathematics has little to say; in fact, the game should perhaps be called "chicken" in this case, rather than "duck".

Answer

I think that we should be explicit about the payoffs in the case where both players hit simultaneously, and about the concept of optimality. I suppose that we are looking for Nash equilibria.

Let's consider 3 versions:

Fair-hunters version. The first who hits the duck gets +1. If both hit simultaneously they get +1/2 each. There is no Nash equilibrium in pure strategies: a simultaneous shot cannot be an equilibrium because each player wants to deviate and shoot $\varepsilon$ earlier.

Hunters-enemies. The first who hits the duck gets +1. If both hit simultaneously they get 0 because of a quarrel. There is no Nash equilibrium, as in the fair-hunters version.

Brothers-hunters. The first who hits the duck gets +1. If both hit simultaneously they each get +1 (they are very surprised and happy). In this case any pair of strategies $(x,x)$ such that $P_A(x)+P_B(x)\geq 1$ is a Nash equilibrium.
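Here is a small numerical check of these claims (my own sketch; the hit curves $p_A(x)=x$, $p_B(x)=x^2$ and the discretization are illustrative assumptions). The key quantity is the payoff a player receives on a simultaneous double hit: 1/2 for fair hunters, 0 for enemies, 1 for brothers.

```python
# Sketch of the three payoff versions, with assumed hit curves
# p_A(x) = x and p_B(x) = x**2 on [0, 1].
def p_A(x): return x
def p_B(x): return x * x

def payoff_A(a, b, tie):
    """A's expected payoff when A plans to shoot at a and B at b.
    tie is A's payoff when both shoot simultaneously and both hit."""
    if a < b:
        return p_A(a)              # A shoots first; wins only on a hit
    if a > b:
        return 1 - p_B(b)          # B shoots first; if B misses, A wins at Y
    hit_A, hit_B = p_A(a), p_B(a)  # simultaneous shots
    return hit_A * (1 - hit_B) + hit_A * hit_B * tie

x = 0.7  # a point with p_A(x) + p_B(x) = 1.19 >= 1

# Fair hunters (tie = 1/2) and enemies (tie = 0): shooting slightly
# earlier beats staying at x, so (x, x) is not an equilibrium.
assert payoff_A(x - 0.01, x, 0.5) > payoff_A(x, x, 0.5)
assert payoff_A(x - 0.01, x, 0.0) > payoff_A(x, x, 0.0)

# Brothers (tie = 1): no unilateral deviation by A on a fine grid
# improves on (x, x); B's deviations are ruled out symmetrically.
eq = payoff_A(x, x, 1.0)
grid = [i / 1000 for i in range(1001)]
assert all(payoff_A(a, x, 1.0) <= eq + 1e-12 for a in grid)
print("brothers: (x, x) is stable, payoff to A =", eq)
```

In the brothers version a simultaneous shot at $x$ gives $A$ exactly $p_A(x)$, which dominates both shooting earlier ($p_A$ is increasing) and shooting later ($1-p_B(x) \leq p_A(x)$ precisely when $P_A(x)+P_B(x)\geq 1$).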