Inequality involving log-sum-exp, variance, and mean


Fix $z_1,\ldots,z_n \in \mathbb R$. Let $\mu_n := \mathrm{mean}(z_1,\ldots,z_n) := \frac{1}{n}\sum_{i=1}^n z_i$, $\mathrm{lse}_n(z_1,\ldots,z_n) := \log(\sum_{i=1}^n e^{z_i})$, and $\sigma^2_n := \mathrm{variance}(z_1,\ldots,z_n) := \frac{1}{2n(n-1)}\sum_{i=1}^n\sum_{j=1}^n(z_i-z_j)^2$.

Question

Is there any inequality linking $\mu_n$, $\sigma^2_n$, and $\mathrm{lse}_n$?

Answer

Since $\log$ is concave, by Jensen's inequality you have

$$ - \log(n) + \mathrm{lse}_n = - \log(n) + \log\Big(\sum_{i=1}^n e^{z_i}\Big) = \log\Big(\frac1n \sum_{i=1}^n e^{z_i}\Big) \ge \frac{1}{n}\sum_{i=1}^n z_i = \mu_n $$ which establishes a relation between the mean and $\mathrm{lse}_n$.
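The bound $\mu_n \le \mathrm{lse}_n - \log(n)$ is easy to sanity-check numerically. Here is a minimal Python sketch (not part of the original answer) that tests it on random inputs, using the standard max-shift trick to keep log-sum-exp numerically stable:

```python
import math
import random

def lse(zs):
    # log-sum-exp with the max-shift trick for numerical stability:
    # log(sum(exp(z_i))) = m + log(sum(exp(z_i - m))), m = max(z_i)
    m = max(zs)
    return m + math.log(sum(math.exp(z - m) for z in zs))

def mean(zs):
    return sum(zs) / len(zs)

random.seed(0)
for _ in range(1000):
    n = random.randint(1, 10)
    zs = [random.uniform(-50.0, 50.0) for _ in range(n)]
    # Jensen: mean(z) <= lse(z) - log(n), equality iff all z_i are equal
    assert mean(zs) <= lse(zs) - math.log(n) + 1e-9
```

Equality holds exactly when all the $z_i$ coincide, since $\log$ is strictly concave.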

For mean and variance, no general relation exists. You can have zero mean, in this case obviously $\sigma_n^2 > \mu_n$. Conversely, you can have that all $z_i = \mu_n$, in this case $\sigma_n^2 =0$ and the mean can be $\mu_n<0=\sigma_n^2$ or $\mu_n>0=\sigma_n^2$.