Entropy maximization in the normal distribution


I am trying to derive the entropy of the normal distribution. Let $p(x)$ be the probability density function of the zero-mean normal distribution \begin{equation} p(x) = \frac{1}{\sqrt{2\pi}\sigma}e^{-\frac{x^2}{2\sigma^2}} \end{equation} Then, attempting integration by parts, I get \begin{equation} \int^{\infty}_{-\infty} x^2p(x) dx = x^2 \int^{\infty}_{-\infty} p(x) dx - \int^{\infty}_{-\infty} 2x \left(\int^{\infty}_{-\infty} p(x) dx\right) dx \end{equation} Because \begin{equation} \int^{\infty}_{-\infty} p(x) dx = 1 \end{equation} this gives \begin{equation} \int^{\infty}_{-\infty} x^2p(x) dx = x^2 - x^2 + C = C \end{equation} However, many proofs online state that \begin{equation} \int^{\infty}_{-\infty} x^2p(x) dx = \sigma^2 \end{equation} Can anyone explain where my reasoning goes wrong?

Best answer:

The integration by parts in the question is invalid: $x^2$ depends on the variable of integration, so it cannot be pulled outside the integral, and the inner antiderivative of $p$ is a function of $x$, not the constant $\int_{-\infty}^\infty p(x)\,dx$.

As for the result itself: by definition, if $X$ has density $p(x)$ then $EX^2=\int_{-\infty}^\infty x^2p(x)\,dx$. So here we have $$\int_{-\infty}^\infty x^2p(x)\,dx=EX^2=\operatorname{Var}(X)+(EX)^2=\operatorname{Var}(X)=\sigma^2.$$
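If you want a quick numerical sanity check of $EX^2=\sigma^2$ for a zero-mean normal, here is a Monte Carlo sketch in Python (the choices $\sigma = 2$, the seed, and the sample size are arbitrary):

```python
# Monte Carlo sketch: estimate E[X^2] for X ~ N(0, sigma^2) and compare to sigma^2.
import random

random.seed(0)
sigma = 2.0
n = 200_000

# Draw zero-mean Gaussian samples and average x^2 over them.
samples = [random.gauss(0.0, sigma) for _ in range(n)]
ex2 = sum(x * x for x in samples) / n  # estimate of E[X^2]

print(ex2)  # close to sigma^2 = 4
```

With $n = 200{,}000$ samples the standard error of the estimate is about $\sigma^2\sqrt{2/n}\approx 0.013$, so the printed value should land well within $0.1$ of $4$.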

If you really want an integration-based derivation, here is one. \begin{align*} \int_{-\infty}^\infty x^2p(x)\,dx&=\frac1{\sqrt{2\pi}\sigma}\int_{-\infty}^\infty x^2e^{-\frac{x^2}{2\sigma^2}}\,dx\\ &=\frac1{\sqrt{2\pi}\sigma}\int_{-\infty}^\infty -\sigma^2x\,d\left(e^{-\frac{x^2}{2\sigma^2}}\right)\\ &=\frac1{\sqrt{2\pi}\sigma}\left(-\sigma^2xe^{-\frac{x^2}{2\sigma^2}}\Big|_{-\infty}^\infty+\sigma^2 \int_{-\infty}^\infty e^{-\frac{x^2}{2\sigma^2}}\,dx\right)\\ &=\sigma^2 \int_{-\infty}^\infty \frac1{\sqrt{2\pi}\sigma}e^{-\frac{x^2}{2\sigma^2}}\,dx\\ &=\sigma^2 \int_{-\infty}^\infty p(x)\,dx\\ &=\sigma^2. \end{align*}
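You can also verify the integral $\int_{-\infty}^\infty x^2 p(x)\,dx = \sigma^2$ directly by quadrature. A sketch with the trapezoidal rule, truncating to $[-10\sigma, 10\sigma]$ where the Gaussian tail is negligible ($\sigma = 1.5$ is an arbitrary choice):

```python
# Trapezoidal-rule sketch: integrate x^2 * p(x) over [-10*sigma, 10*sigma],
# where p is the zero-mean Gaussian density; the tails beyond are negligible.
import math

sigma = 1.5
p = lambda x: math.exp(-x * x / (2 * sigma**2)) / (math.sqrt(2 * math.pi) * sigma)

a, b, n = -10 * sigma, 10 * sigma, 100_000
h = (b - a) / n
xs = [a + i * h for i in range(n + 1)]
# Trapezoidal rule: full weight on interior nodes, half weight at the endpoints.
integral = h * (sum(x * x * p(x) for x in xs) - 0.5 * (a * a * p(a) + b * b * p(b)))

print(integral)  # close to sigma^2 = 2.25
```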

Finally, the entropy. Since $\log p(x)=-\frac{x^2}{2\sigma^2}-\log(\sqrt{2\pi}\sigma)$, \begin{align*} H&=-\int_{-\infty}^\infty p(x)\log p(x)\,dx\\ &=\frac1{2\sigma^2}\int_{-\infty}^\infty x^2p(x)\,dx+\log(\sqrt{2\pi}\sigma)\int_{-\infty}^\infty p(x)\,dx\\ &=\frac12+\log(\sqrt{2\pi}\sigma)\\ &=\log(\sqrt{2\pi e}\sigma). \end{align*}
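And a matching numerical check of the entropy formula, computing $-\int p\log p\,dx$ by the same truncated quadrature and comparing it with the closed form $\log(\sqrt{2\pi e}\,\sigma)$ (again, $\sigma = 0.7$ and the truncation range are arbitrary choices):

```python
# Sketch: numerically evaluate the differential entropy -∫ p(x) log p(x) dx
# of a zero-mean Gaussian and compare to log(sqrt(2*pi*e) * sigma).
import math

sigma = 0.7
p = lambda x: math.exp(-x * x / (2 * sigma**2)) / (math.sqrt(2 * math.pi) * sigma)

a, b, n = -10 * sigma, 10 * sigma, 100_000
h = (b - a) / n
xs = [a + i * h for i in range(n + 1)]
# Riemann sum is fine here: the integrand p*log(p) is vanishingly small at ±10σ.
H = -h * sum(p(x) * math.log(p(x)) for x in xs)

closed_form = math.log(math.sqrt(2 * math.pi * math.e) * sigma)
print(H, closed_form)  # the two values agree closely
```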