Prove $\sum_{i=1}^{n} I(p_i) \ge n \log n$ (self-information)


If $(p_1, p_2, \dots, p_n)$ is a probability distribution and $I(p_i)$ is the self-information of the outcome with probability $p_i$, prove that $$ \sum_{i=1}^{n}I(p_i) \ge n \log n.$$

I tried Jensen's inequality. I know that $I(p) = -\ln p$, so the claim to prove is $$-\sum_{i=1}^{n}\log p_i \ge n \log n.$$ Since $\log$ is concave, Jensen gives $$\sum_{i=1}^{n}p_i\log \frac{1}{p_i} \le \log \sum_{i=1}^{n} p_i \cdot \frac{1}{p_i} = \log n,$$ which is just the usual entropy bound $H(p) \le \log n$.

I'm not sure how I can get the $n \log n$ from this.

There are 2 answers below.

Answer 1:

You are defining $I(p_i) = -\ln p_i$ and thus seek to prove that $$ \sum_{i=1}^n -\ln p_i \ge n \ln n. $$

Note that $\sum_{k=1}^n p_k = 1$, so by the AM–GM inequality $$ \sqrt[n]{\prod_{k=1}^n p_k} \le \frac{1}{n} \sum_{k=1}^n p_k = \frac{1}{n}, $$ hence $$ \prod_{k=1}^n \frac{1}{p_k} \ge n^n, $$ and taking natural logarithms of both sides finishes the job: $\sum_{k=1}^n -\ln p_k \ge n \ln n$.
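As a quick numerical sanity check of this bound (a sketch, taking $\log$ to be the natural logarithm, matching the $\ln$ used in this answer; the helper names are mine), one can verify $\sum_i -\ln p_i \ge n \ln n$ on random distributions, with equality at the uniform one:

```python
import math
import random

def self_info_sum(p):
    # Sum of self-information -ln(p_i) over a probability vector p.
    return sum(-math.log(pi) for pi in p)

def random_dist(n):
    # A random probability distribution of size n (normalized weights).
    w = [random.random() for _ in range(n)]
    s = sum(w)
    return [x / s for x in w]

random.seed(0)
for n in (2, 5, 10):
    for _ in range(1000):
        p = random_dist(n)
        # The claimed lower bound, up to floating-point tolerance.
        assert self_info_sum(p) >= n * math.log(n) - 1e-9
    # Equality at the uniform distribution p_i = 1/n.
    uniform = [1.0 / n] * n
    assert abs(self_info_sum(uniform) - n * math.log(n)) < 1e-9
```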

Answer 2:

If you take $f(x)=\log x$, which is concave, Jensen's inequality gives $$f\left(\frac{\sum_{i=1}^n p_i}{n}\right)\geq \frac{\sum_{i=1}^n f(p_i)}{n},$$ and since $\sum_{i=1}^n p_i = 1$ the left side is $\log\frac{1}{n} = -\log n$, so $$-\log n\geq \frac{\sum_{i=1}^{n} -I(p_i)}{n},$$ which is the same as $$\sum_{i=1}^{n} I(p_i)\geq n\log n.$$
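To make the bound concrete, here are the equality case and one strict instance, worked from the definition $I(p) = -\ln p$ above (the specific numbers are mine):

```latex
% Equality: the uniform distribution p_i = 1/n attains the bound.
\sum_{i=1}^{n} I\!\left(\tfrac{1}{n}\right)
  = \sum_{i=1}^{n} \left(-\ln\tfrac{1}{n}\right)
  = n \ln n.

% A strict instance with n = 2 and p = (1/4, 3/4):
\sum_{i=1}^{2} I(p_i)
  = \ln 4 + \ln\tfrac{4}{3}
  = \ln\tfrac{16}{3} \approx 1.674
  \;>\; 2\ln 2 \approx 1.386.
```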