Equivalence of minimax and Nash equilibrium in finite zero-sum games


Consider a two-player zero-sum game with strategy sets $X\subseteq \mathbb{R}^n$ and $Y\subseteq \mathbb{R}^m$ for players 1 and 2, and payoff function $f:X \times Y \rightarrow \mathbb{R}$ (player 1 maximizes, player 2 minimizes).

Now let $(x^*,y^*)$ be a Nash equilibrium. In a zero-sum game, the equilibrium condition is equivalent to $\sup_{x\in X}f(x,y^*) = f(x^*,y^*) = \inf_{y\in Y}f(x^*,y)$.

Because of the inequality $$\sup_{x\in X}\inf_{y\in Y}f(x,y) \leq \inf_{y\in Y}\sup_{x\in X}f(x,y)$$ and the equilibrium condition above, it is clear that a Nash equilibrium is also a point satisfying $$ \sup_{x\in X}\inf_{y\in Y}f(x,y) = \inf_{y\in Y}\sup_{x\in X}f(x,y) = f(x^*,y^*),$$ namely the minimax equality. The converse need not hold: for example, $X = [0,1]$, $Y = (0,1]$, $f(x,y)=xy$ satisfies the minimax equality (with value $0$) even though no Nash equilibrium exists.
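For completeness, here is a direct check of that counterexample (my own verification, spelled out from the definitions above). For every fixed $x\in[0,1]$ we have $\inf_{y\in(0,1]} xy = 0$, and for every fixed $y\in(0,1]$ we have $\sup_{x\in[0,1]} xy = y$, so
$$\sup_{x\in[0,1]}\inf_{y\in(0,1]} xy = 0 = \inf_{y\in(0,1]} y = \inf_{y\in(0,1]}\sup_{x\in[0,1]} xy.$$
Yet no pair $(x^*,y^*)$ satisfies the equilibrium condition: since $y^* > 0$, the condition $\sup_{x} x y^* = x^* y^*$ forces $x^* = 1$, but then $\inf_{y\in(0,1]} f(1,y) = 0 < y^* = f(1,y^*)$, so the second half of the condition fails.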

However, I have read in many textbooks (without finding a proof) that in the case of finite games the two conditions are in fact equivalent, and minimax points are also Nash equilibria. I would guess this has to do with the fact that in the mixed extension of a finite game, $X$ and $Y$ are compact and convex, and $f$ is continuous (in fact bilinear). How would one go about proving it?
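Not a proof, but a sanity check of the finite case: the row player's maximin strategy in a matrix game can be computed by the standard linear-programming reduction, and one can then verify numerically that the resulting value is attained at a saddle point. Below is a minimal sketch (the helper name `solve_zero_sum`, the use of `scipy.optimize.linprog`, and the matching-pennies matrix are my own choices, not from any textbook):

```python
import numpy as np
from scipy.optimize import linprog

def solve_zero_sum(A):
    """Row player's maximin strategy for payoff matrix A, where the
    row player gets x^T A y: maximize v subject to (A^T x)_j >= v for
    every pure column j, with x a probability vector."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    # Variables: (x_1, ..., x_m, v); maximize v <=> minimize -v.
    c = np.zeros(m + 1)
    c[-1] = -1.0
    # v - (A^T x)_j <= 0 for each column j.
    A_ub = np.hstack([-A.T, np.ones((n, 1))])
    b_ub = np.zeros(n)
    # x must sum to 1.
    A_eq = np.zeros((1, m + 1))
    A_eq[0, :m] = 1.0
    b_eq = np.array([1.0])
    bounds = [(0, 1)] * m + [(None, None)]  # v is free
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=bounds)
    return res.x[:m], res.x[-1]

# Matching pennies: value 0, unique equilibrium (1/2, 1/2) vs (1/2, 1/2).
A = np.array([[1, -1], [-1, 1]], dtype=float)
x, v = solve_zero_sum(A)      # row player's maximin strategy and value
y, w = solve_zero_sum(-A.T)   # column player, by swapping roles
```

For matching pennies the LP returns $x = y = (1/2, 1/2)$ with value $0$, and one checks directly that $\sup_x f(x,y^*) = f(x^*,y^*) = \inf_y f(x^*,y)$ holds there, i.e. the maximin point is indeed a Nash equilibrium.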