Solving a Min-Max problem analytically.


When studying the asymptotic behaviour of an estimator I encounter the situation that, for any $(a, b, c) \in [0,1]^3$, a sequence $A_n$ satisfies $$ A_n = \mathcal O\left( n^{-a} + n^{-mb} + n^{c-1} + n^{-c(q+m) + b + a - 1} \right),$$ where $q \geq 1$ and $m \geq 2$ are fixed natural numbers. By the properties of big $\mathcal O$, the greatest exponent determines the rate of convergence of $A_n$. Hence I wish to solve the following min-max problem

$$ \min_{(a,b, c) \in [0,1]^3 } \max \left\{ -a \, ;\, -mb \, ;\, c-1 \, ;\, {-c(q+m) + b + a -1 } \right\}.$$

How can I find the solution of this problem in terms of $q$ and $m$?

In many papers and books this problem is usually hidden: an appropriate $(a,b,c)$ is simply proposed, sometimes without even mentioning the optimality of the choice.


Example

I have tried by inspection. For example, I obtained a value of $-\frac{m+q+1}{m+q+2}\left(1-\frac{1}{m}\right)$ for

$$ (a,b,c) = \left(\frac{m+q+1}{m+q+2}, \frac{m+q+1}{(m+q+2)m}, \frac{1}{m+q+2}\right)$$

Though it feels there is some room for improvement.
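As a sanity check, one can evaluate the largest of the four exponents at this candidate and compare it against a brute-force search. A minimal sketch, assuming NumPy is available and taking $q = 1$, $m = 2$ purely for illustration:

```python
import numpy as np

# illustrative values of the fixed parameters (any q >= 1, m >= 2 would do)
q, m = 1, 2

def worst_exponent(a, b, c):
    # the largest exponent governs the rate of A_n
    return max(-a, -m * b, c - 1.0, -c * (q + m) + b + a - 1.0)

# candidate obtained by inspection above
a0 = (m + q + 1) / (m + q + 2)
b0 = (m + q + 1) / ((m + q + 2) * m)
c0 = 1.0 / (m + q + 2)
print("candidate:", worst_exponent(a0, b0, c0))   # -0.4 for q=1, m=2

# coarse grid search over [0,1]^3 to see whether a better point exists
grid = np.linspace(0.0, 1.0, 101)
best = min(worst_exponent(a, b, c) for a in grid for b in grid for c in grid)
print("grid best:", best)                          # about -0.72, better than the candidate
```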

Best Answer

Suppose I find an $(a,b,c)$ such that all four functions are equal. Then altering any of the variables will increase the maximum: for instance, increasing $a$ will increase $-c(q+m) + a + b - 1$, and decreasing it will increase $-a$. Similarly for $b$ and $c$. So, if such a point exists, it must attain the minimax.

Let us try to find such a point. We must have $a = mb$ and $a = 1-c$, giving $b = (1-c)/m$. For feasible $c$, $(a,b) = (1-c, (1-c)/m)$ is feasible as well, so far so good.

Finally, we must have $$ c-1 = -c(q+m) - 1 + (1-c)\left(1 + \tfrac{1}{m}\right) \iff 1-c = \frac{m + q + 1}{m + q + 2 + \tfrac{1}{m}},$$ which yields a feasible point as well. So we're done, and the minimax value is the negative of the above, namely $c - 1 = -\frac{m+q+1}{m+q+2+1/m}$.
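A quick symbolic check of this, as a sketch assuming SymPy is available, confirming that the derived point equalises all four exponents:

```python
import sympy as sp

m, q = sp.symbols('m q', positive=True)

# the point derived above: a = 1 - c, b = (1 - c)/m, with 1 - c as computed
c = 1 - (m + q + 1) / (m + q + 2 + 1/m)
a = 1 - c
b = (1 - c) / m

exponents = [-a, -m*q if False else -m*b, c - 1, -c*(q + m) + b + a - 1]
# all differences from the first exponent should simplify to zero
print([sp.simplify(e - exponents[0]) for e in exponents])   # [0, 0, 0, 0]
```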

Strictly speaking, I have only argued that the above is the only local minimum. But the $\max$ of affine functions, as above, is a convex function, and so any local minimum, if it lies in the feasible set, is the global minimum.


More generally, the above problem has a particular structure that can be exploited. Note that $$ \min_{a,b,c}\max\{f_1(a), f_2(b), f_3(c), g(a,b,c)\} = \min_{a,b} \max \{f_1(a), f_2(b), \min_{c}\max\{f_3(c), g(a,b,c)\} \}$$

This follows simply by noting that the outer minimisation can be written as an iterated minimisation, that the $\max$ of the four terms can be written as $\max\{f_1(a), f_2(b), \max\{f_3(c), g(a,b,c)\}\}$, and that the inner minimisation over $c$ can then be pushed past the terms that do not depend on it.

This allows us to 'peel off' variables, making the problem simpler at each step. This may not always be tractable, but in this case it certainly is, and it yields the same point.
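Here is a sketch of the peel-off approach in code, again assuming NumPy and taking $q = 1$, $m = 2$ just as an example. The inner problem over $c$ is the max of an increasing and a decreasing affine function, so its minimiser is the crossing point $c^\ast = \frac{a+b}{q+m+1}$ clipped to $[0,1]$, and only $(a,b)$ remain to be searched over:

```python
import numpy as np

# illustrative parameters
q, m = 1, 2

def inner_min(a, b):
    # min over c of max{c - 1, -c(q+m) + a + b - 1}: the first term increases
    # in c and the second decreases, so the minimiser is the crossing point
    c_star = np.clip((a + b) / (q + m + 1), 0.0, 1.0)
    return max(c_star - 1.0, -c_star * (q + m) + a + b - 1.0)

# outer minimisation over (a, b), done here by a crude grid search
grid = np.linspace(0.0, 1.0, 401)
value, a_best, b_best = min(
    (max(-a, -m * b, inner_min(a, b)), a, b) for a in grid for b in grid
)
# approaches -(m+q+1)/(m+q+2+1/m) = -0.727... for q=1, m=2 as the grid is refined
print(value, a_best, b_best)
```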