I have a three-player duel in which players A, B, and C pick a time t in the interval [0,1] to fire at a common target, and each can fire only once. When player A fires at time t, he will hit with probability t. Similarly for players B and C, but they will hit with probability t^2 and t^3 respectively. When a player has fired, their opponents will know and are allowed to fire at a different time than the one they initially had in mind. I'm asked to calculate optimal strategies for all players.
The two-player variant of the game has already been solved. Say A and B play the same duel with two players; then an optimal strategy for A is to fire at the time t such that t = 1 - t^2, and similarly for B.
Using the same method as in the two-player duel, A should fire at a time t0 such that t0 = (1 - t0^2) * 0.682, in which 0.682 is the time at which players A and C should fire if they play a two-player duel. A similar equation holds for player B: he fires at a time t1 such that t1^2 = (1 - t1) * 0.755^2, in which 0.755 is the time at which players B and C should fire if they play a two-player duel.
When solving these equations, I find that t0 = 0.507 and t1 = 0.522, which are not equal. This solution is therefore not a Nash equilibrium, because player A could instead fire at t = 0.521 (still just before B) for a higher probability of winning.
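For reference, the two equations can be checked numerically; here is a small bisection sketch (the constants 0.682 and 0.755 are the two-player firing times as stated above):

```python
# Solve the two equations by bisection:
#   A:  t0 = (1 - t0^2) * 0.682     (0.682 ~ A-vs-C two-player time)
#   B:  t1^2 = (1 - t1) * 0.755^2   (0.755 ~ B-vs-C two-player time)

def bisect(f, lo, hi, tol=1e-12):
    """Bisection root finder; assumes f(lo) and f(hi) have opposite signs."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

t0 = bisect(lambda t: t - (1 - t**2) * 0.682, 0.0, 1.0)
t1 = bisect(lambda t: t**2 - (1 - t) * 0.755**2, 0.0, 1.0)
print(round(t0, 3), round(t1, 3))  # 0.507 0.522
```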
Is firing at t0 for player A and at t1 for player B an optimal strategy? I thought that an optimal strategy minimizes the loss a player's opponents can inflict, and is not necessarily the same as a Nash equilibrium.
Thanks in advance!
To clarify the two-player duel: at any time t, A will hit with probability t and B will hit with probability t^2. The optimal time for both players to fire is when t = 1 - t^2. This is when A's probability of hitting (and thereby winning) equals A's probability of winning when B fires at time t and misses.
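This indifference point can be found numerically; a minimal bisection sketch (the root happens to be the reciprocal golden ratio):

```python
# Find the root of f(t) = t + t^2 - 1 on [0, 1], i.e. solve t = 1 - t^2.

def bisect(f, lo, hi, tol=1e-12):
    """Bisection root finder; assumes f(lo) and f(hi) have opposite signs."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

t_AB = bisect(lambda t: t + t**2 - 1, 0.0, 1.0)
print(round(t_AB, 5))  # 0.61803, i.e. (sqrt(5) - 1)/2
```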
I'll try to analyze a slightly more general two-player game first.
We consider a family $\mathcal F$ of real-valued functions such that for each function $p$ in the family, $0 \leq p(t) \leq 1$ for all $t \in [0,1]$; $p(t) < p(t')$ whenever $0 \leq t < t' \leq 1$ (so $p$ is strictly increasing); and $p(1) = 1.$ Each player is associated with one function from this family.
For the two-player game between any two players X and Y, we have a function $p_X\in\mathcal F$ associated with X, a function $p_Y\in\mathcal F$ associated with Y, and some number $t_S < 1.$
To play this game, players X and Y each secretly choose a number $t_X,$ $t_Y$ (respectively) within the range $[t_S, 1].$ Neither knows what number the other has chosen until the numbers are simultaneously announced. After the numbers $t_X$ and $t_Y$ are announced, one of two procedures is followed, either "X shoots first" or "Y shoots first." If $t_X < t_Y$ then X shoots first; if $t_Y < t_X$ then Y shoots first; if $t_X = t_Y$ then either X shoots first or Y shoots first with equal probability.
If X shoots first then X wins with probability $p_X(t_X)$ and Y wins with probability $1 - p_X(t_X).$ (In effect, if X does not win, Y is then allowed to choose the number $1$ instead of $t_Y$ and therefore wins with probability $1.$) If Y shoots first then Y wins with probability $p_Y(t_Y)$ and X wins with probability $1 - p_Y(t_Y).$
So there are only two outcomes to this game: X wins or Y wins. Assume each player wants to maximize their chance of winning.
Let $t_E$ be a time such that $p_X(t_E) + p_Y(t_E) = 1.$ Then if $t_E \geq t_S,$ the combined strategy $t_X=t_Y=t_E,$ so that X wins with probability $p_X(t_E)$ and Y wins with probability $p_Y(t_E),$ is an equilibrium, because if either player chose a smaller number they would reduce their chance to win, but if they chose a larger number then the other player could also increase their number (by a lesser amount) and increase their chance to win.
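A numeric sanity check of this equilibrium argument, sketched for the A-vs-B pair ($p_X(t) = t,$ $p_Y(t) = t^2,$ so $t_E = (\sqrt5 - 1)/2$); the payoff function and the sampled deviations are my own illustration, not part of the original argument:

```python
# Check that t_E with p_X(t_E) + p_Y(t_E) = 1 is an equilibrium of the
# two-player game, here for p_X(t) = t, p_Y(t) = t^2.

def payoff_X(tX, tY, pX, pY):
    """X's winning probability given both chosen times."""
    if tX < tY:
        return pX(tX)             # X shoots first
    if tX > tY:
        return 1 - pY(tY)         # Y shoots first; X wins iff Y misses
    return 0.5 * pX(tX) + 0.5 * (1 - pY(tY))  # tie: coin flip

pX = lambda t: t
pY = lambda t: t * t
tE = (5 ** 0.5 - 1) / 2           # root of t + t^2 = 1

base = payoff_X(tE, tE, pX, pY)
# No unilateral deviation by X improves on t_E; deviating down strictly
# loses, deviating up is only neutral (until Y responds, as argued above).
assert all(payoff_X(tE + d, tE, pX, pY) <= base + 1e-12
           for d in (-0.2, -0.05, 0.05, 0.2))
print(round(base, 5))  # 0.61803
```

Note that the upward deviations come out exactly neutral, matching the parenthetical above: a larger $t_X$ only becomes bad once Y raises $t_Y$ in response.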
If $t_E < t_S$ then the equilibrium is $t_X=t_Y=t_S,$ because $\frac12 p_X(t_S) + \frac12 (1 - p_Y(t_S)) > (1 - p_Y(t_S))$ (the left side being the worst case if X chooses $t_S$ and the right side being the worst case if X chooses a greater number), so that $t_S$ dominates all other choices for X, and because $t_S$ dominates all other choices for Y for similar reasons.
For the three-player game, each of the players A, B, and C is associated with the functions $p_A,$ $p_B,$ and $p_C$ (respectively) in the family $\mathcal F.$ Players A, B, and C each secretly chooses a number $t_A,$ $t_B,$ $t_C$ (respectively) within the range $[0, 1].$ Then the three numbers are announced and one player is selected to "shoot first." If one player chose a number smaller than both other players' numbers, that player shoots first; if there is a "tie" for smallest number then each person choosing that number has an equal chance to shoot first. Whichever player shoots first wins with probability $p_Z(t_Z),$ where $p_Z\in\mathcal F$ is the function associated with that player and $t_Z$ is the number chosen by that player. If that player does not win, the winner of the three-player game is the winner of a two-player game between the remaining two players with $t_S = t_Z.$
For your particular three players, $p_A(t) = t,$ $p_B(t) = t^2,$ and $p_C(t) = t^3,$ so that the equilibrium strategy for a game between A and C is $t_A=t_C\approx 0.68233$ and for a game between B and C is $t_B=t_C\approx 0.75488$ (provided that $t_S$ is not greater). The probability that C wins is only about $(1 - t_A)0.75488^3$ if A shoots first (with $t_A < 0.75488$) and only about $(1 - t_B^2)0.68233^3$ if B shoots first (with $t_B < 0.68233$).
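Both pairwise equilibria come from solving $p_X(t) + p_Y(t) = 1$; a bisection sketch reproducing the two constants:

```python
# Pairwise equilibria, solving p_X(t) + p_Y(t) = 1 by bisection:
#   A vs C: t + t^3 = 1
#   B vs C: t^2 + t^3 = 1

def bisect(f, lo, hi, tol=1e-12):
    """Bisection root finder; assumes f(lo) and f(hi) have opposite signs."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

t_AC = bisect(lambda t: t + t**3 - 1, 0.0, 1.0)
t_BC = bisect(lambda t: t**2 + t**3 - 1, 0.0, 1.0)
print(round(t_AC, 5), round(t_BC, 5))  # 0.68233 0.75488
```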
So C should allow A to shoot first if and only if $t_A < 0.569842$ (because then $p_C(t_A) = t_A^3 < (1 - t_A)0.75488^3$) and should allow B to shoot first if and only if $t_B < 0.591205$ (because then $p_C(t_B) = t_B^3 < (1 - t_B^2)0.68233^3$); otherwise C would be better off shooting first.
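C's two thresholds can be recomputed the same way; a sketch (the pairwise equilibria are re-derived so the block stands alone):

```python
# C's "allow them to shoot first" thresholds, via bisection:
#   allow A first iff t_A^3 < (1 - t_A)   * t_BC^3   (t_BC ~ 0.75488)
#   allow B first iff t_B^3 < (1 - t_B^2) * t_AC^3   (t_AC ~ 0.68233)

def bisect(f, lo, hi, tol=1e-12):
    """Bisection root finder; assumes f(lo) and f(hi) have opposite signs."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

t_AC = bisect(lambda t: t + t**3 - 1, 0.0, 1.0)     # A-vs-C equilibrium
t_BC = bisect(lambda t: t**2 + t**3 - 1, 0.0, 1.0)  # B-vs-C equilibrium

thr_A = bisect(lambda t: t**3 - (1 - t) * t_BC**3, 0.0, 1.0)
thr_B = bisect(lambda t: t**3 - (1 - t**2) * t_AC**3, 0.0, 1.0)
print(round(thr_A, 4), round(thr_B, 4))  # 0.5698 0.5912
```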
The probability that B wins is about $(1 - t_A)0.75488^2$ if A shoots first (with $t_A < 0.75488$), so B should only allow A to shoot first if $t_A < 0.521939,$ because then $$p_B(t_A) = t_A^2 < (1 - t_A)0.75488^2. \tag1$$
But should A set $t_A < 0.521939$? Suppose $t_A = 0.521$; then A wins with probability $0.521.$ But if $t_A = t_B = 0.521939$ then A wins with probability $\frac12(0.521939) + \frac12(1 - 0.521939^2)0.68233 \approx 0.50919.$ So it would seem advantageous to A to set $t_A$ low enough at the start of the three-player game so that it is against B's interest to set $t_B$ to the same value or less. But this is an awkward answer, because while there is a very well-defined minimum value of $t_B$ that B would rationally choose, namely, the solution of Equation $(1),$ there is no "next lower" real number.
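The comparison in that paragraph can be spelled out numerically; a short sketch reproducing the $0.50919$ figure:

```python
# A's winning probability in the tie scenario t_A = t_B = 0.521939:
#   half the time A shoots first and wins with probability t_A;
#   half the time B shoots first (hitting with t_B^2); if B misses,
#   A and C continue from t_S = t_B, and since the A-vs-C equilibrium
#   0.68233 lies above t_S, A then wins with probability 0.68233.
t = 0.521939
t_AC = 0.68233                      # A-vs-C equilibrium time
p_tie = 0.5 * t + 0.5 * (1 - t**2) * t_AC
p_undercut = 0.521                  # A firing alone just below B's cutoff
print(round(p_tie, 5), p_undercut)  # 0.50919 0.521
```

So undercutting B (winning with probability about $0.521$) beats tying with B (about $0.50919$), which is why A wants $t_A$ strictly below B's cutoff.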
The dominating strategy for B seems to involve initially setting $t_B$ to the solution of Equation $(1),$ approximately $0.521939,$ so it seems that is what B will do. Player A then has to make an arbitrary choice of some value of $t_A$ that is smaller than this value of $t_B$; there is no "best" value of $t_A,$ only values that are so close to $t_B$ that the difference might be considered negligible. Player A will shoot first, and B and C will then set both $t_B$ and $t_C$ equal to the solution of $t^2 + t^3 = 1,$ approximately $0.75488.$