What is the difference between entropy for a discrete and a continuous random variable $X$?


The entropy of a continuous random variable with a Gaussian distribution is defined as:

$H(X)=-\int p(x)\ln p(x)\,dx$

After some calculation this becomes:

$H(X)=\frac{1}{2}\left(\ln(2\pi \sigma^{2})+1\right)$
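For completeness, here is a sketch of that calculation (not spelled out in the original post), starting from the Gaussian density:

$$p(x)=\frac{1}{\sqrt{2\pi\sigma^{2}}}\,e^{-\frac{(x-\mu)^{2}}{2\sigma^{2}}}
\quad\Longrightarrow\quad
-\ln p(x)=\frac{1}{2}\ln(2\pi\sigma^{2})+\frac{(x-\mu)^{2}}{2\sigma^{2}}$$

Taking the expectation with respect to $p(x)$ and using $\int p(x)(x-\mu)^{2}\,dx=\sigma^{2}$ gives

$$H(X)=\frac{1}{2}\ln(2\pi\sigma^{2})+\frac{\sigma^{2}}{2\sigma^{2}}
=\frac{1}{2}\left(\ln(2\pi\sigma^{2})+1\right).$$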

After this, the book says:

The last equality follows from the variance of a normal distribution, $\sigma^2=\int p(x)(x-\mu)^{2}\,dx$.

Note that, unlike the entropy of a discrete variable, which is always non-negative, here $H(X)<0$ when $\sigma^{2}<\frac{1}{2\pi e}$.

The book thus claims that a property which holds for the entropy of any discrete random variable fails to hold here.

I understood every calculation and formula that was derived, but I cannot see what that property is.


BEST ANSWER

You have to combine two things: the final formula you found for $H(X)$, and the formula you found for the variance.

As your book states, when $\sigma^{2}< \frac{1}{2 \pi e}$, the term $\ln(2 \pi \sigma^2)$ in your final entropy expression becomes less than $-1$, so $H(X)$ becomes negative. The property that fails is non-negativity: the entropy of a discrete variable is always non-negative, but the differential entropy of a continuous variable need not be. It seems this is what you missed here.
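A quick numeric check of this claim (a minimal sketch; the function name `gaussian_entropy` is my own, not from the book):

```python
import math

def gaussian_entropy(var):
    """Differential entropy (in nats) of a Gaussian with variance `var`."""
    return 0.5 * (math.log(2 * math.pi * var) + 1)

# The entropy crosses zero exactly at sigma^2 = 1 / (2*pi*e).
threshold = 1 / (2 * math.pi * math.e)

print(gaussian_entropy(1.0))            # standard normal: positive
print(gaussian_entropy(threshold))      # at the threshold: zero
print(gaussian_entropy(threshold / 2))  # below the threshold: negative
```

For the standard normal ($\sigma^2=1$) the entropy is $\frac{1}{2}(\ln 2\pi + 1)\approx 1.419$ nats, while any variance below $\frac{1}{2\pi e}$ yields a negative value, which can never happen for the Shannon entropy of a discrete variable.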