Is the negative normalized entropy convex?


The negative normalized entropy is defined as $$h:\mathbb{R}_{>0}^n \rightarrow \mathbb{R} \ , \ h(x)=\sum_{i=1}^n x_i\log \frac{x_i}{\sum_{j=1}^n x_j} \ .$$ Is this function convex?

Its Hessian is given by $$H_{ij} = \frac{\partial^2 h}{\partial x_i\partial x_j}(x) = \frac{\delta_{ij}}{x_i} - \frac{1}{\sum_{k=1}^n x_k} \ ,$$ so the question is equivalent to asking whether this matrix is positive semi-definite for all $x \succ 0$.

One way to do this would be to show that for any $v \in \mathbb{R}^n$, $$v^T H v = \sum_{i=1}^{n} \frac{v_i^2}{x_i} - \frac{\left(\sum_{i=1}^{n}v_i\right)^2}{\sum_{j=1}^n x_j} \geq 0 \ ;$$ however, I am not sure if this holds.
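As a quick numerical sanity check (not a proof; the sampling ranges are arbitrary choices), one can sample random $x \succ 0$ and $v \in \mathbb{R}^n$ and evaluate the quadratic form directly:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
for _ in range(1000):
    x = rng.uniform(0.1, 10.0, n)  # random point with x > 0
    v = rng.normal(size=n)
    # v^T H v = sum_i v_i^2 / x_i - (sum_i v_i)^2 / sum_j x_j
    q = np.sum(v**2 / x) - v.sum()**2 / x.sum()
    assert q >= -1e-9, "quadratic form went negative"
print("all sampled quadratic forms are nonnegative")
```

No counterexample turns up, which is consistent with the Cauchy–Schwarz bound $\left(\sum_i v_i\right)^2 \le \left(\sum_i \frac{v_i^2}{x_i}\right)\left(\sum_j x_j\right)$.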

With this convexity shown, I would be able to provide a good answer for this open question: Conjugate of negative normalized entropy


2 Answers

BEST ANSWER

It is convex. Since $h$ is homogeneous of degree $1$ on $\mathbb{R}_{>0}^n$ (and $H$ is therefore homogeneous of degree $-1$), one can without loss of generality assume that $\sum_{i=1}^n x_i = 1$ when proving that $H$ is positive semi-definite.

For such $x$, it boils down to proving that for any $v \in \mathbb{R}^n$, $$v^T H v = \sum_{i=1}^{n} \frac{v_i^2}{x_i} - \left(\sum_{i=1}^{n}v_i\right)^2 \geq 0 .$$

By letting $v_i = x_i w_i$, this is equivalent to proving that for any $w \in \mathbb{R}^n$, $$\sum_{i=1}^{n} x_i w_i^2 - \left(\sum_{i=1}^{n}x_i w_i\right)^2 \geq 0 ,$$ which is true by Jensen's inequality applied to the convex function $t \mapsto t^2$ with weights $x_i$.
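A numerical illustration of this reduction (sampling points on the simplex; not part of the proof, and the sampling scheme is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
for _ in range(500):
    x = rng.uniform(0.1, 1.0, n)
    x /= x.sum()                            # normalize so that sum x_i = 1
    H = np.diag(1.0 / x) - np.ones((n, n))  # H_ij = delta_ij / x_i - 1
    lam = np.linalg.eigvalsh(H)
    assert lam.min() >= -1e-8, "negative eigenvalue found"
print("H is positive semi-definite at all sampled points")
```

The smallest eigenvalue is numerically zero at every sample, with eigenvector $x$ itself ($Hx = 0$), which reflects the degree-$1$ homogeneity of $h$.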


Each term of $h$ is convex because it is the perspective of the convex function $f(x) = x\log x$, composed with an affine map (the second argument of the perspective is the affine function $x \mapsto \sum_{j} x_j$). This observation also makes it straightforward to derive the conjugate of $h$ from the conjugate of $f$.
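To spell out the perspective argument (a sketch of the standard construction): the perspective of $f(x) = x\log x$ is $$g(y,t) = t\,f\!\left(\frac{y}{t}\right) = y \log\frac{y}{t}, \qquad t > 0,$$ which is jointly convex in $(y,t)$ because $f$ is convex. Composing with the linear map $x \mapsto \bigl(x_i,\ \sum_{j=1}^n x_j\bigr)$ shows that each term $x_i \log \frac{x_i}{\sum_{j=1}^n x_j}$ is convex in $x$, and hence so is their sum $h$.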