Supremum characterisation of entropy

Why is it true that for $g:\mathbb{R}^n \to \mathbb{R}_+$ and any measure $\mu$,
$$ \int g\log g \, d\mu - \left(\int g \, d\mu\right)\log\left(\int g \, d\mu\right) = \sup_{f:\, \int e^f d\mu \le 1} \int fg \, d\mu \, ? $$

Any references or hints are greatly welcome.
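As a quick sanity check, here is a small numerical experiment on a finite measure space (a sketch only; the discrete weights and the positive function `g` below are arbitrary choices, not from the question). It confirms that $f^* = \log(g/\int g\,d\mu)$ is feasible and attains the left-hand side, while randomly drawn feasible $f$ never exceed it:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
mu = rng.uniform(0.5, 1.5, n)   # weights of an arbitrary finite measure mu
g = rng.uniform(0.1, 2.0, n)    # an arbitrary positive function g

G = (g * mu).sum()                                  # ∫ g dμ
lhs = (g * np.log(g) * mu).sum() - G * np.log(G)    # left-hand side

# Candidate optimizer f* = log(g / ∫g dμ); note ∫ e^{f*} dμ = 1.
f_star = np.log(g / G)
assert np.isclose((np.exp(f_star) * mu).sum(), 1.0)
assert np.isclose((f_star * g * mu).sum(), lhs)     # attains the supremum

# Random f, shifted so that ∫ e^f dμ = 1 (hence feasible):
for _ in range(10_000):
    f = rng.normal(size=n)
    f -= np.log((np.exp(f) * mu).sum())
    assert (f * g * mu).sum() <= lhs + 1e-9
print("identity verified numerically")
```

The shift `f -= np.log(...)` just normalizes each random candidate so the constraint holds with equality; feasible $f$ with strict inequality can only do worse.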

Best answer

The objective $f \mapsto \int fg \, d\mu$ is linear (hence concave) and the constraint $\int e^f d\mu \le 1$ defines a convex set, so we can use Lagrange's method.

Let
$$ L \triangleq \int fg\,d\mu + \lambda\left(1 - \int e^f\,d\mu\right). $$
Substitute $h := e^f$, i.e. $f = \log h$. Then
$$ L = \int (\log h)\,g\,d\mu + \lambda\left(1 - \int h\,d\mu\right). $$
Suppose $h$ is optimal, and consider perturbing it infinitesimally by $\delta h$. Then
$$ \delta L = \int \frac{1}{h}\,\delta h\, g\,d\mu - \lambda\int \delta h\,d\mu = \int\left(\frac{g}{h}-\lambda\right)\delta h\,d\mu. $$
Optimality requires $\frac{g}{h}\equiv\lambda$, i.e. $h=\lambda^{-1}g$. Since $g$ is positive, the constraint must be binding: if $\int e^f\,d\mu < 1$, then replacing $f$ by $f+\varepsilon$ for small $\varepsilon>0$ stays feasible and increases the objective by $\varepsilon\int g\,d\mu > 0$. Hence $\int h\,d\mu = 1$, so $\lambda = \int g\,d\mu$. Finally,
\begin{align} \sup\int fg\,d\mu & = \int \log(\lambda^{-1}g)\,g\,d\mu\\ & = \int (\log g)\,g\,d\mu - \log\left(\int g\,d\mu\right)\int g\,d\mu. \end{align}
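The closed form above can be checked numerically on a discrete measure (a sketch; the weights and `g` are arbitrary illustrative choices). It verifies the binding constraint $\int h\,d\mu = 1$, the stationarity condition $g/h \equiv \lambda$, and that the value at $f = \log h$ matches the entropy expression:

```python
import numpy as np

rng = np.random.default_rng(1)
mu = rng.uniform(0.5, 1.5, 8)    # arbitrary discrete measure weights
g = rng.uniform(0.2, 3.0, 8)     # arbitrary positive function

lam = (g * mu).sum()             # λ = ∫ g dμ
h = g / lam                      # the stationary point h = λ⁻¹ g

# Constraint binds: ∫ h dμ = 1.
assert np.isclose((h * mu).sum(), 1.0)
# Stationarity: g/h is identically λ.
assert np.allclose(g / h, lam)

# The value at f = log h equals the left-hand side of the identity.
value = (np.log(h) * g * mu).sum()
lhs = (g * np.log(g) * mu).sum() - lam * np.log(lam)
assert np.isclose(value, lhs)
print("Lagrange optimizer verified")
```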