How to prove that unnormalized neg entropy is strongly convex with respect to the $1$-norm?


The unnormalized negative entropy of $\mathbf{x} \in \mathbb{R}^n_+$ is $$ g(\mathbf{x}) = \sum_i (x_i \log(x_i) - x_i). $$ It is stated that $g(\mathbf{x})$ is strongly convex with respect to the $1$-norm, i.e., its Bregman divergence $D_g(\mathbf{x}, \mathbf{y}) = g(\mathbf{x}) - g(\mathbf{y}) - \langle \nabla g(\mathbf{y}), \mathbf{x} - \mathbf{y} \rangle$ satisfies $$ \sum_i\left(x_i \log\left(\frac{x_i}{y_i}\right) + y_i - x_i\right) \geq \frac{\lambda}{2} \left(\sum_i |x_i - y_i|\right)^2 $$ for some $\lambda > 0$. (Since $\nabla g(\mathbf{y}) = (\log y_i)_i$, the left-hand side is exactly $D_g(\mathbf{x}, \mathbf{y})$.) I have tried and searched around, but could not establish this inequality.
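As a numerical sanity check (not a proof), the sketch below compares the Bregman divergence of $g$ against $\frac{1}{2}\|\mathbf{x}-\mathbf{y}\|_1^2$ for random points restricted to the probability simplex, where the extra terms $y_i - x_i$ sum to zero and the claim reduces to the classical Pinsker inequality with $\lambda = 1$. The restriction to the simplex is an assumption here; over all of $\mathbb{R}^n_+$ a single constant $\lambda$ need not hold.

```python
import numpy as np

rng = np.random.default_rng(0)

def bregman_negentropy(x, y):
    # Bregman divergence of g(x) = sum_i (x_i log x_i - x_i):
    # D_g(x, y) = sum_i (x_i log(x_i / y_i) + y_i - x_i)
    return np.sum(x * np.log(x / y) + y - x)

# Smallest observed ratio D_g(x, y) / (0.5 * ||x - y||_1^2)
# over random pairs on the probability simplex; Pinsker's
# inequality predicts it stays >= 1.
worst = np.inf
for _ in range(10_000):
    x = rng.random(5); x /= x.sum()
    y = rng.random(5); y /= y.sum()
    ratio = bregman_negentropy(x, y) / (0.5 * np.linalg.norm(x - y, 1) ** 2)
    worst = min(worst, ratio)

print(worst)
```

If the printed value dipped below $1$, the conjectured constant $\lambda = 1$ on the simplex would be falsified; the check passing is of course consistent with, but does not prove, the inequality.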

Thanks for your help.