A question about the convexity of entropy


Lately I have been reading some references about entropy. They all say that "the entropy function $H(X)$ is a concave function". The definition is as follows: let $X$ be a continuous random variable with probability density function (pdf) $f(x)$ (in short, $X \sim f(x)$). The entropy of $X$ is defined as $$h(X) = -\int f(x) \log f(x)\, \mathrm{d}x = -E_X(\log f(X)).$$ My question: the log function is concave, so the negative of the log function is convex. Why, then, is the entropy of $X$ concave?

1 Answer

The concavity is with respect to the distribution, not with respect to the argument of the logarithm. Consider the simple case of the binary entropy of a Bernoulli($p$) variable: $$H(X) = \operatorname{H}_\text{b}(p) = -p \log_2 (p) - (1 - p) \log_2 (1 - p).$$ Its second derivative in $p$, $$\frac{d^2 \operatorname{H}_\text{b}(p)}{dp^2} = -\frac{1}{(1-p)\, p\, \ln 2} < 0 \quad \text{for } p \in (0,1),$$ is strictly negative, so $\operatorname{H}_\text{b}$ is concave as a function of $p$.
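As a quick numerical sanity check (a plain-Python sketch; the function name `binary_entropy` is my own), one can verify both the defining concavity inequality and the sign of the second derivative:

```python
import math

def binary_entropy(p):
    """Binary entropy H_b(p) in bits, with the convention H_b(0) = H_b(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Concavity: H_b(lam*p1 + (1-lam)*p2) >= lam*H_b(p1) + (1-lam)*H_b(p2)
p1, p2, lam = 0.1, 0.8, 0.3
lhs = binary_entropy(lam * p1 + (1 - lam) * p2)
rhs = lam * binary_entropy(p1) + (1 - lam) * binary_entropy(p2)
assert lhs >= rhs

# The second derivative -1 / (p * (1-p) * ln 2) is negative on (0, 1)
for p in (0.2, 0.5, 0.9):
    assert -1 / (p * (1 - p) * math.log(2)) < 0
```

The assertions pass for any choice of $p_1, p_2, \lambda \in [0,1]$, matching the second-derivative argument above.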