$\min_x \max_yf(x,y) = \min_y \max_x f(x,y)$


Let $f(x,y)$ be a real function of the variables $x,y$ (which can be real vectors). Under what conditions do we have the following equality:

$$\min_x \max_yf(x,y) = \min_y \max_x f(x,y)$$

For example, this equality is true if $f(x,y) = xy$ and $x,y$ are real scalars.
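This example can be checked numerically (my own sketch, not part of the question; I restrict to a grid on $[-1,1]^2$, since the unconstrained maximum of $xy$ over all real $y$ is infinite whenever $x \neq 0$):

```python
import numpy as np

# Grid over [-1, 1] x [-1, 1]; a bounded domain keeps every max finite.
xs = np.linspace(-1.0, 1.0, 201)
ys = np.linspace(-1.0, 1.0, 201)
F = np.outer(xs, ys)       # F[i, j] = f(x_i, y_j) = x_i * y_j

lhs = F.max(axis=1).min()  # min_x max_y f(x, y)
rhs = F.max(axis=0).min()  # min_y max_x f(x, y)
print(lhs, rhs)            # both sides equal 0, attained at x = 0, y = 0
```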

Note that this is not the same as Von Neumann's minimax theorem (https://en.wikipedia.org/wiki/Minimax_theorem), because here the role of the variables is exchanged (e.g., $x$ is minimized on the left-hand side, but it is maximized on the right-hand side).

Though I do not know whether convexity/concavity of $f(x,y)$ with respect to either argument plays a role here (as it does for Von Neumann's minimax theorem), I am using the convexity-related tags since that is the context where I have seen related questions. Similarly, I am tagging game-theory, though I am not sure it is directly applicable.

There are 2 answers below.


Here is a tentative proof, under some assumptions. I would like to see additional arguments (or counterexamples) to make this clearer.

Assumptions:

  1. We suppose that $f(x,y)$ is concave in $y$ for all $x$.
  2. That the equation $\partial f/\partial y=0$ can be solved for a unique $x$ given any $y$.
  3. That the solutions to both saddle-point optimizations are attained at a point where $\partial f/\partial x = \partial f/\partial y = 0$.

Proof:

By assumption 1 (concavity in $y$), finding $\max_y f(x,y)$ is equivalent to solving $\partial f/\partial y = 0$. It follows that the original problem is equivalent to a constrained minimization over all variables:

$$\begin{aligned} \min_x \max_y f(x,y) &= \min_{x,y} f(x,y) \quad \left(\text{subject to }\frac{\partial f}{\partial y}=0\right) \\ &= \min_y \min_x f(x,y) \quad \left(\text{subject to }\frac{\partial f}{\partial y}=0\right) \end{aligned}$$

where we simply exchanged the order of the two minimizations. Since, by assumption 2, there is a unique $x$ that makes $\partial f/\partial y=0$ for any $y$, the inner minimization over $x$ is trivial, and can formally be replaced by a maximization:

$$\begin{aligned} \min_x \max_y f(x,y) &= \min_y \max_x f(x,y) \quad \left(\text{subject to }\frac{\partial f}{\partial y}=0\right) \end{aligned}$$

Finally, if the unconstrained solution to this last problem automatically satisfies $\partial f/\partial y=0$ (our third assumption above), then the constraint can be lifted, and we obtain:

$$\begin{aligned} \min_x \max_y f(x,y) &= \min_y \max_x f(x,y) \end{aligned}$$
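As an illustrative sanity check (my own, not from the answer), take $f(x,y) = xy - y^2$, which satisfies the assumptions: it is concave in $y$, and $\partial f/\partial y = x - 2y = 0$ gives the unique $x = 2y$ for each $y$, with the saddle point at $x = y = 0$. Restricting to a grid on $[-1,1]^2$ (an extra assumption, to keep the maxima finite), both orderings agree:

```python
import numpy as np

# f(x, y) = x*y - y**2: concave in y; df/dy = x - 2y = 0 has the
# unique solution x = 2y for each y; saddle point at x = y = 0.
xs = np.linspace(-1.0, 1.0, 201)
ys = np.linspace(-1.0, 1.0, 201)
X, Y = np.meshgrid(xs, ys, indexing="ij")
F = X * Y - Y**2           # F[i, j] = f(x_i, y_j)

lhs = F.max(axis=1).min()  # min_x max_y f(x, y)
rhs = F.max(axis=0).min()  # min_y max_x f(x, y)
print(lhs, rhs)            # both agree with the saddle value f(0, 0) = 0
```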


Are you trying to find an "iff" condition?

For sufficient conditions, you could consider the following:

  1. $f$ is symmetric, i.e. $f(x,y)=f(y,x)$.
  2. $f$ is ordinally symmetric, meaning that if $f(a,b)>f(c,d)$, then $f(b,a)>f(d,c)$.

Condition (2) is weaker than (1). However, if $x,y$ are vectors and $f$ is a linear functional (i.e. a vNM representation), then (2) and (1) are equivalent.
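Condition (1) can be illustrated numerically (my own sketch; the symmetric function $f(x,y)=(x-y)^2$ and the domain $[-1,1]^2$ are chosen just for this example). For a symmetric $f$, renaming the variables turns $\min_y \max_x f(x,y)$ into $\min_x \max_y f(y,x) = \min_x \max_y f(x,y)$:

```python
import numpy as np

# A symmetric example of condition (1): f(x, y) = (x - y)**2 on [-1, 1]^2.
xs = np.linspace(-1.0, 1.0, 201)
X, Y = np.meshgrid(xs, xs, indexing="ij")
F = (X - Y) ** 2           # f(x, y) = f(y, x), so F is a symmetric matrix

lhs = F.max(axis=1).min()  # min_x max_y f(x, y)
rhs = F.max(axis=0).min()  # min_y max_x f(x, y)
print(lhs, rhs)            # equal by symmetry (both 1.0 here, at x = 0)
```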

What you are suggesting above looks like a single-crossing property, which can be even weaker.