Verifying the convexity of some function


Convex function: We say that $f:X\rightarrow \mathbb{R}$ is a convex function if for every $\lambda\in [0,1]$ and every $x,y\in X$ (where $X$ is a convex set) we have $f(\lambda x+(1-\lambda)y)\leq\lambda f(x)+(1-\lambda)f(y)$.

We have $F:\mathbb{R}_{++}^2\rightarrow \mathbb{R}$ given by $F(x_1,x_2)=-x_1x_2-\alpha$, where $\mathbb{R}_{++}^2=(0,+\infty)\times (0,+\infty)$ and $\alpha\in \mathbb{R}$ is arbitrary. Is the function $F$ convex?



BEST ANSWER

I cannot tell you how many times I have had people insist that this function is convex. (I make software for convex optimization, and people seem rather insistent on being able to use this function in their models.)

It is not! End of discussion. The indefiniteness of the Hessian proves it.
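To make this concrete, here is a minimal numerical sketch of a convexity violation (the points $x=(1,1)$, $y=(3,3)$ and $\lambda=\tfrac12$ are just an illustrative choice; the constant $\alpha$ cancels out of the inequality):

```python
# A quick counterexample sketch: alpha drops out of the convexity
# inequality because it is an additive constant.
def F(x1, x2, alpha=0.0):
    return -x1 * x2 - alpha

x, y = (1.0, 1.0), (3.0, 3.0)
lam = 0.5
mid = (lam * x[0] + (1 - lam) * y[0], lam * x[1] + (1 - lam) * y[1])

lhs = F(*mid)                          # F at the midpoint: -4.0
rhs = lam * F(*x) + (1 - lam) * F(*y)  # chord value:       -5.0
print(lhs, rhs)                        # convexity would require lhs <= rhs
assert lhs > rhs                       # violated, so F is not convex
```

The violation occurs along the diagonal $x_1=x_2$, where $F$ restricts to $-t^2-\alpha$, a concave function of $t$.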

But I think I may know why people may think this: because the inequality $F(x_1,x_2)\leq 0$ describes a convex set in $\mathbb{R}^2$. That is, $$\left\{(x_1,x_2)\,\middle|\,x_1,x_2\geq 0,~-x_1x_2 -\alpha\leq 0\right\} = \left\{(x_1,x_2)\,\middle|\,x_1,x_2\geq 0,~x_1x_2 \geq -\alpha\right\} $$ is a convex set. It's the entire quadrant $\mathbb{R}_+^2$ if $\alpha\geq 0$, and one lobe of a hyperbola if $\alpha<0$. So perhaps people assume that if the inequality describes a convex set, the function must be convex; but this is not the case.

The fact is that $F$ is quasiconvex. A quasiconvex function $f$ is one whose sublevel sets $\{x\,|\,f(x)\leq\beta\}$ are convex for every fixed $\beta\in\mathbb{R}$. Every convex function is quasiconvex, but the converse is not true.
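One can spot-check quasiconvexity numerically by sampling the equivalent inequality $F(\lambda x+(1-\lambda)y)\leq\max(F(x),F(y))$ at random points of the positive quadrant. A sanity-check sketch, not a proof:

```python
import random

random.seed(0)

def F(x1, x2, alpha=0.0):
    return -x1 * x2 - alpha

# Sample F(lam*x + (1-lam)*y) <= max(F(x), F(y)) at random points of
# the open positive quadrant; a positive gap would refute quasiconvexity.
worst_gap = 0.0
for _ in range(10000):
    x = (random.uniform(0.01, 10), random.uniform(0.01, 10))
    y = (random.uniform(0.01, 10), random.uniform(0.01, 10))
    lam = random.random()
    mid = (lam * x[0] + (1 - lam) * y[0], lam * x[1] + (1 - lam) * y[1])
    gap = F(*mid) - max(F(*x), F(*y))
    worst_gap = max(worst_gap, gap)

print(worst_gap)           # no violation beyond rounding error
assert worst_gap <= 1e-9
```

Note that the same loop with the convexity inequality (the chord value in place of the max) fails immediately, as in the counterexample above.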

Now, the question is: given that the set described by the inequality is convex, how do I go about describing this inequality to convex optimization software that requires convex and/or concave functions? Well, if $\alpha\leq 0$, you can do this:

$$x_1,x_2\geq 0,~x_1x_2 \geq -\alpha \quad\Longleftrightarrow\quad x_1,x_2\geq 0,~\sqrt{x_1x_2} \geq \sqrt{-\alpha}$$

The geometric mean is a concave function on $\mathbb{R}_+^2$. Alternatively, you can do this:

$$x_2>0, ~ x_1 \geq -\alpha/ x_2$$

This isn't quite equivalent, because $x_2$ is constrained to be strictly positive. But many convex optimization solvers can't distinguish between strict inequalities and non-strict inequalities, open sets and closed sets, anyway. So this is practically equivalent.
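Both claims, that the two inequality descriptions carve out the same set and that the geometric mean is midpoint-concave, can be spot-checked with a small script (the value $\alpha=-4$ is a hypothetical choice; any $\alpha\leq 0$ would do):

```python
import math
import random

random.seed(1)
alpha = -4.0  # hypothetical sample value; any alpha <= 0 works here

def in_original(x1, x2):
    return x1 >= 0 and x2 >= 0 and x1 * x2 >= -alpha

def in_geomean(x1, x2):
    return x1 >= 0 and x2 >= 0 and math.sqrt(x1 * x2) >= math.sqrt(-alpha)

def geomean(x1, x2):
    return math.sqrt(x1 * x2)

for _ in range(10000):
    x1, x2 = random.uniform(0, 10), random.uniform(0, 10)
    # the two descriptions agree pointwise
    assert in_original(x1, x2) == in_geomean(x1, x2)

# midpoint-concavity spot check for the geometric mean on R_+^2
for _ in range(10000):
    x = (random.uniform(0, 10), random.uniform(0, 10))
    y = (random.uniform(0, 10), random.uniform(0, 10))
    mid = ((x[0] + y[0]) / 2, (x[1] + y[1]) / 2)
    assert geomean(*mid) >= (geomean(*x) + geomean(*y)) / 2 - 1e-9
```

In a modeling tool this would typically be expressed through the tool's geometric-mean atom rather than a raw square root, so that the concavity is recognized structurally.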

ANOTHER ANSWER

This is most easily answered by checking whether the Hessian is positive semidefinite.

The Hessian is defined as $\nabla^2f(x)$, where $[\nabla^2f(x)]_{ij} = \dfrac{\partial^2 f}{\partial x_i \partial x_j}$, and in this case it is easy to compute: $$\nabla^2 F(x_1,x_2)=\begin{pmatrix}0 & -1\\ -1 & 0\end{pmatrix},$$ a constant matrix with eigenvalues $\pm 1$. Since one eigenvalue is negative, the Hessian is not positive semidefinite, so $F$ is not convex.
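Since the Hessian of $F(x_1,x_2)=-x_1x_2-\alpha$ is constant, its eigenvalues can be computed once with the standard closed form for a symmetric $2\times 2$ matrix; a quick check in plain Python:

```python
import math

# Hessian of F(x1, x2) = -x1*x2 - alpha: constant, independent of alpha.
# d2F/dx1^2 = 0, d2F/dx1dx2 = d2F/dx2dx1 = -1, d2F/dx2^2 = 0.
H = [[0.0, -1.0],
     [-1.0, 0.0]]

# Eigenvalues of a symmetric 2x2 matrix [[a, b], [b, c]]:
# mean of the diagonal +/- sqrt(((a - c)/2)^2 + b^2)
a, b, c = H[0][0], H[0][1], H[1][1]
mean = (a + c) / 2
radius = math.sqrt(((a - c) / 2) ** 2 + b ** 2)
eigs = (mean - radius, mean + radius)
print(eigs)  # (-1.0, 1.0): one negative, one positive -> indefinite
```

One negative eigenvalue rules out positive semidefiniteness, so $F$ is not convex anywhere on its domain.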