Convexity of a multivariable function


I have a function of the following type:

$f(x_1,x_2,\dots,x_n)$. Each $x_i$ has domain $[0,\infty)$. The function is continuous and differentiable in each variable (it is an expectation over several continuous probability distributions). My numerical experiments suggest that the function is convex and attains its optimum at an interior point where the gradient vanishes; in other words, neither $x_i=0$ nor $x_i\to\infty$ is optimal.

Now, my goal is to prove that there exists an optimal solution, which is unique.

I tried proving convexity by showing that the Hessian is positive definite.

I have already proved that the function is strictly convex with respect to each variable separately, by showing that $\frac{\partial^2 f}{\partial x_i^2}>0$ for every $i$.

These terms form the diagonal of the Hessian. For the off-diagonal elements I know the following properties:

Every off-diagonal element is negative: $$a_{ij}<0, \quad j\neq i.$$

Its absolute value is smaller than the corresponding diagonal element: $$a_{ii}>|a_{ij}|, \quad \forall j\neq i.$$
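As a quick numerical sanity check (not part of the original question), one can test in Python whether a small symmetric matrix with exactly these properties (positive diagonal, negative off-diagonal entries, each $|a_{ij}|<a_{ii}$) is positive definite, by inspecting its eigenvalues. The matrices below are hypothetical examples, not the actual Hessian:

```python
import numpy as np

# Hypothetical 3x3 symmetric "Hessian" obeying the stated properties:
# positive diagonal, negative off-diagonal entries, each |a_ij| < a_ii.
H = np.array([
    [ 2.0, -0.9, -0.9],
    [-0.9,  2.0, -0.9],
    [-0.9, -0.9,  2.0],
])

# A symmetric matrix is positive definite iff all eigenvalues are > 0.
print(np.linalg.eigvalsh(H))          # all positive -> positive definite

# Note: the elementwise condition a_ii > |a_ij| alone is weaker than
# row diagonal dominance (a_ii > sum_j |a_ij|), and by itself it does
# not guarantee positive definiteness, as this second example shows.
H2 = np.array([
    [ 2.0, -1.9, -1.9],
    [-1.9,  2.0, -1.9],
    [-1.9, -1.9,  2.0],
])
print(np.linalg.eigvalsh(H2))         # smallest eigenvalue is negative
```

So for a definiteness argument one would need either the stronger row-sum dominance or some additional structure of the specific function.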

My questions: 1) Are there other ways to prove convexity besides computing the determinants of the Hessian's leading principal minors?

2) Do I really need convexity to show that my function has a unique global minimum, or are there other ways to show this?