Suppose we have a two-player zero-sum game, and let $G(p,q)$ denote the row player's loss when the row player plays mixed strategy $p$ and the column player plays mixed strategy $q$. We define the minmax solution $(p^*,q^*)$ as follows:
$$p^*=\arg\min_p\max_qG(p,q)$$ $$q^*=\arg\max_q\min_pG(p,q)$$
Suppose now we take $\tilde{q}=\arg\max_qG(p^*,q)$. Is $(p^*,\tilde{q})$ a Nash equilibrium?
Here is my approach: I want to show that $\tilde{q}$ is also a minmax solution, i.e. $$\min_p G(p,q^*)\leq\min_p G(p,\tilde{q}).$$ We have \begin{align*}G(p^*,\tilde{q})&=\max_qG(p^*,q)\quad(\text{definition of $\tilde{q}$})\\ &=\min_p\max_q G(p,q)\quad(\text{definition of $p^*$})\\ &=\max_q\min_p G(p,q)\quad(\text{minimax theorem})\\ &=\min_pG(p,q^*)\quad(\text{definition of $q^*$}).\end{align*} Then I get stuck. What should I do next, or is the statement wrong in general? (I think it is right, though.)
The answer is, surprisingly, no. Consider the classical Rock-Paper-Scissors game, where $p^*$ and $q^*$ are the uniform strategies. Against $p^*$, every column strategy yields the same expected payoff, so rock, paper, and scissors are all best responses, and we may pick $\tilde{q}$ to be the pure strategy that plays scissors with probability 1. Then $(p^*,\tilde{q})$ is not a Nash equilibrium: against pure scissors, the row player would rather deviate and play rock.
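To make the counterexample concrete, here is a small numerical sketch (the loss matrix and strategy ordering rock/paper/scissors are my own encoding, not from the question): it checks that pure scissors is a best response to the uniform $p^*$, yet the row player can improve on $p^*$ against it.

```python
import numpy as np

# Row player's loss matrix G for Rock-Paper-Scissors
# (rows = row player's move, columns = column player's move,
#  ordered rock, paper, scissors; -1 means the row player wins).
G = np.array([
    [ 0,  1, -1],   # rock
    [-1,  0,  1],   # paper
    [ 1, -1,  0],   # scissors
])

p_star = np.array([1/3, 1/3, 1/3])   # uniform minmax strategy for the row player
q_tilde = np.array([0.0, 0.0, 1.0])  # pure scissors, a best response to p*

# Every column strategy earns the same (zero) loss against the uniform p*,
# so pure scissors is indeed a maximizer of G(p*, q):
print(p_star @ G)            # -> [0. 0. 0.]

# But against pure scissors, p* gives the row player loss 0, while
# deviating to pure rock gives loss -1 (a win), so (p*, q~) is not Nash:
rock = np.array([1.0, 0.0, 0.0])
print(p_star @ G @ q_tilde)  # -> 0.0
print(rock @ G @ q_tilde)    # -> -1.0
```

So $p^*$ is not a best response to $\tilde{q}$, even though $\tilde{q}$ is a best response to $p^*$.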