Optimizer of a supermodular function


Consider the problem $$\max_x \{xy+f(x)\},$$ where $x$ and $y$ are in $\mathbb{R}$. Denote the optimal solution by $x^*(y)$. When multiple solutions exist, we take the smallest one. We are interested in the monotonicity of $x^*(y)$.

Argument 1: since the maximand of the problem is a supermodular function of $(x, y)$, by the monotone-selection property of supermodular functions (Topkis's theorem), we conclude that $x^*(y)$ is nondecreasing in $y$.

Argument 2: if we take the first-order derivative of the maximand (assuming differentiability), we get $y+f'(x)$. Thus, an interior optimum $x^*(y)$ necessarily satisfies $$f'(x)=-y.$$ As $y$ increases, to ensure that $x^*(y)$ increases in $y$, it seems we need $f'(x)$ to be decreasing in $x$, i.e., $f$ to be concave.
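To spell out what Argument 2 implicitly uses (a sketch, assuming $f$ is twice continuously differentiable and $f''(x^*(y)) \neq 0$ at an interior optimum): differentiating the first-order condition $f'(x^*(y)) = -y$ with respect to $y$ gives
$$f''(x^*(y))\,\frac{dx^*}{dy} = -1 \quad\Longrightarrow\quad \frac{dx^*}{dy} = \frac{-1}{f''(x^*(y))},$$
which is positive exactly when $f''(x^*(y)) < 0$. Note this only requires concavity of $f$ locally at the optimum (which the second-order condition for an interior maximum already gives, weakly), not global concavity.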

My question is: is the concavity of $f(x)$ needed for the monotonicity of $x^*(y)$? It seems that the first argument does not need it, while the second does. Thanks.


Let $f:\mathbb{R}\rightarrow\mathbb{R}$ be any function. Fix $y_1, y_2 \in \mathbb{R}$.

Let $x_1 \in \mathbb{R}$ be any (possibly non-unique) maximizer of $xy_1 + f(x)$ over $x \in \mathbb{R}$.

Let $x_2 \in \mathbb{R}$ be any (possibly non-unique) maximizer of $xy_2 + f(x)$ over $x \in \mathbb{R}$.

Here we are implicitly assuming the expressions $xy_1 + f(x)$ and $xy_2 + f(x)$ both have finite maximizers over $x \in \mathbb{R}$.

Claim:

We always have $(x_1-x_2)(y_1-y_2)\geq 0$, and so indeed the nondecreasing property holds: $$y_1 < y_2 \implies x_1 \leq x_2$$

Proof:

By definition of "maximizer" we have: \begin{align} x_1 y_1 + f(x_1) &\geq x_2y_1 + f(x_2) \\ x_2y_2 + f(x_2) &\geq x_1y_2 + f(x_1) \end{align} Summing these inequalities gives $$ x_1 y_1 + x_2y_2 + f(x_1) + f(x_2) \geq x_2y_1 + x_1y_2 + f(x_1) + f(x_2) $$ Simplifying the above inequality yields $(x_1-x_2)(y_1-y_2)\geq 0$. $\Box$
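The claim is also easy to check numerically. Here is a minimal sketch (my own grid search, not part of the original argument), using the non-concave $f(x) = -x^4 + x^2$ that appears in the example below:

```python
import numpy as np

# Minimal numerical check of the claim: even for a non-concave f, the
# smallest maximizer of x*y + f(x) is nondecreasing in y.
f = lambda x: -x**4 + x**2            # non-concave example
xs = np.linspace(-2.0, 2.0, 400_001)  # search grid for x

def smallest_maximizer(y):
    # np.argmax returns the first index attaining the max, i.e. the
    # smallest maximizer on the grid.
    return xs[np.argmax(xs * y + f(xs))]

ys = np.linspace(-1.0, 1.0, 201)
opt = np.array([smallest_maximizer(y) for y in ys])
assert np.all(np.diff(opt) >= 0)      # nondecreasing in y
```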


In particular, we do not need to define $x^*(y)$ as the infimum maximizer of $xy+f(x)$. For each $y\in \mathbb{R}$, we can define $x^*(y)$ as any (finite) maximizer of $xy+f(x)$, and still we have $$y_1 < y_2 \implies x^*(y_1) \leq x^*(y_2)$$


For an example relating to Justin's comment below: take the differentiable but non-concave function $f(x) = -x^4 + x^2$. Then maximizers of $xy + f(x)$ exist for all $y \in \mathbb{R}$, and all maximizers are finite. Any maximizer $x^*(y)$ indeed satisfies the equation $-4x^3 + 2x=-y$, but that equation generally has three real roots, not all of which correspond to maximizers. Further, $x^*(y)$ is a discontinuous function of $y$. Note that:

\begin{align} y=-0.000001 &\implies \arg\max_{x \in \mathbb{R}} [xy + f(x)] = \{-0.70711\}\\ y=0 &\implies \arg\max_{x \in \mathbb{R}} [xy+f(x)] = \{-0.7071067, 0.7071067\}\\ y = 0.000001&\implies \arg\max_{x \in \mathbb{R}} [xy + f(x)] = \{0.70711\} \end{align}
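These values can be reproduced with a brute-force grid search (a hypothetical check, not the computation used above). Note that `np.argmax` returns the first, i.e. smallest, maximizer on the grid, matching the question's tie-breaking convention:

```python
import numpy as np

# Brute-force check of the jump at y = 0 for f(x) = -x^4 + x^2.
f = lambda x: -x**4 + x**2
xs = np.linspace(-1.5, 1.5, 3_000_001)    # grid step 1e-6

def smallest_maximizer(y):
    return xs[np.argmax(xs * y + f(xs))]  # first index = smallest maximizer

left = smallest_maximizer(-1e-6)    # close to -1/sqrt(2) ~ -0.7071
right = smallest_maximizer(+1e-6)   # close to +1/sqrt(2) ~ +0.7071
```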

Here is a MATLAB plot of $x^*(y)$ (image not reproduced; it shows $x^*(y)$ against $y$, with the jump discontinuity at $y=0$).

In this example it can be shown that $x^*(y)$ is a strictly increasing function of $y$, and we indeed have $f'(x^*(y))=-y$ for all $y \in \mathbb{R}$. However, discontinuity of $x^*(y)$ allows $f'(x^*(y))$ to be decreasing in $y$ without requiring $f'(z)$ to be decreasing in $z$.
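Both facts can be verified numerically by solving the first-order condition $-4x^3+2x=-y$ with `np.roots` and keeping the root that attains the maximum (a sketch, under the assumption that the global maximizer is among the real stationary points, which holds here since $xy+f(x)\to-\infty$ as $|x|\to\infty$):

```python
import numpy as np

# Verify: x*(y) is strictly increasing, and f'(x*(y)) = -y for all y.
f = lambda x: -x**4 + x**2

def x_star(y):
    # Stationary points satisfy f'(x) = -4x^3 + 2x = -y,
    # i.e. they are roots of -4x^3 + 2x + y = 0.
    roots = np.roots([-4.0, 0.0, 2.0, y])
    real = roots[np.abs(roots.imag) < 1e-9].real
    return real[np.argmax(real * y + f(real))]   # keep the maximizing root

ys = np.linspace(-2.0, 2.0, 401)
opt = np.array([x_star(y) for y in ys])
assert np.all(np.diff(opt) > 0)                        # strictly increasing
assert np.allclose(-4 * opt**3 + 2 * opt, -ys, atol=1e-6)  # f'(x*(y)) = -y
```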