Checking the convexity of an optimization problem using the Hessian


I am trying to prove that the following optimization problem is convex.

$$\begin{array}{ll} \text{minimize} & x_1 x_2^2\\ \text{subject to} & x_1=x_2\\ & x_1\geq 0\end{array}$$

I calculated the Hessian matrix

$$H = 2 \begin{bmatrix} 0 & x_2 \\ x_2 & x_1 \end{bmatrix}$$

As far as I know, the Hessian must be positive semidefinite over the whole domain for the problem to be convex, but here it is not. For instance, at $x_1=x_2=1$,

$$\det(H)=-4$$

so $H$ is indefinite and the objective is not convex. I can't see what I am doing wrong.
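As a quick numerical sanity check (a sketch using NumPy, not part of the original question), the eigenvalues of the Hessian at $(1,1)$ confirm it is indefinite:

```python
import numpy as np

# Hessian of f(x1, x2) = x1 * x2^2, evaluated at x1 = x2 = 1:
# H = 2 * [[0, x2], [x2, x1]] = [[0, 2], [2, 2]]
H = 2 * np.array([[0.0, 1.0],
                  [1.0, 1.0]])

eigvals = np.linalg.eigvalsh(H)  # eigenvalues of the symmetric matrix, ascending
print(eigvals)                   # one negative, one positive -> indefinite
print(np.linalg.det(H))          # approximately -4, consistent with det(H) = -4
```

A negative determinant of a symmetric $2\times 2$ matrix already implies eigenvalues of opposite signs, so the eigenvalue check and the determinant check agree.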

1 Answer

Using the equality constraint to substitute $x_1=x_2=x$, the objective becomes $x\cdot x^2 = x^3$, so an equivalent problem is $$\begin{array}{ll} \text{minimize} & x^3\\ \text{subject to} & x\geq0\end{array}$$ The Hessian of this problem is $H=\frac{\partial^2 x^3}{\partial x^2}=6x$, which satisfies $H\geq 0$ for all $x\geq 0$. So the objective is convex on the feasible set, even though $x_1x_2^2$ is not convex on all of $\mathbb{R}^2$.
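The domain restriction is the whole point: $x^3$ is not convex on all of $\mathbb{R}$, but it satisfies the midpoint-convexity inequality $f\!\left(\tfrac{x+y}{2}\right)\le\tfrac{f(x)+f(y)}{2}$ for $x,y\geq 0$. A small numerical sketch (NumPy assumed; the helper `midpoint_convex_on` is illustrative, not a standard function) makes this concrete:

```python
import numpy as np

def midpoint_convex_on(f, grid):
    """Check f((x+y)/2) <= (f(x)+f(y))/2 for every pair of points in the grid."""
    x, y = np.meshgrid(grid, grid)
    return bool(np.all(f((x + y) / 2) <= (f(x) + f(y)) / 2 + 1e-12))

f = lambda x: x**3

print(midpoint_convex_on(f, np.linspace(0, 5, 101)))   # True: convex on x >= 0
print(midpoint_convex_on(f, np.linspace(-5, 5, 101)))  # False: e.g. x=-2, y=0 violates it
```

For instance, with $x=-2$ and $y=0$: $f(-1)=-1$ but $\tfrac{f(-2)+f(0)}{2}=-4$, and $-1\not\leq -4$, so the inequality fails once negative points are allowed.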