What are the boundaries of the entropy of a gaussian random variable? What is their meaning?


Given a Gaussian random variable $X\sim \mathcal N(\mu,\sigma^2)$ with p.d.f.: $$ f(x)=\frac{1}{\sqrt{2\pi\sigma^2}}\cdot e^{-\frac{(x-\mu)^2}{2\sigma^2}} $$

Then we can calculate its (differential) entropy, the expected value of its information content, as: $$ H(X)=-\int f(x)\log_a f(x)\,dx\\ =-\int f(x)\log_a\left(\frac{1}{\sqrt{2\pi\sigma^2}}\cdot e^{-\frac{(x-\mu)^2}{2\sigma^2}}\right)dx\\ =-\int f(x)\left(\log_a(2\pi\sigma^2)^{-1/2}-\frac{(x-\mu)^2}{2\sigma^2}\log_a e\right)dx\\ =\frac{1}{2}\log_a(2\pi\sigma^2)\underbrace{\int f(x)\,dx}_{=1}+\frac{1}{2\sigma^2}\log_a e\underbrace{\int (x-\mu)^2 f(x)\,dx}_{=E[(X-\mu)^2]=\sigma^2}\\ =\frac{1}{2}\log_a(2\pi e\sigma^2) $$
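As a sanity check on this closed form, here is a minimal numerical sketch (assuming natural logarithms, i.e. $a=e$, and an arbitrary illustrative choice of $\mu=0$, $\sigma=1.5$) that compares a Riemann-sum approximation of $-\int f(x)\ln f(x)\,dx$ with $\frac{1}{2}\ln(2\pi e\sigma^2)$:

```python
import numpy as np

# Illustrative parameters (not from the question itself)
mu, sigma = 0.0, 1.5

# Dense grid covering essentially all of the probability mass
x = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 200001)
dx = x[1] - x[0]

# Gaussian p.d.f.
f = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)

# Differential entropy: Riemann sum of -f(x) * ln f(x)
h_numeric = -np.sum(f * np.log(f)) * dx

# Closed form: (1/2) * ln(2*pi*e*sigma^2)
h_closed = 0.5 * np.log(2 * np.pi * np.e * sigma ** 2)

print(h_numeric, h_closed)  # the two values agree closely
```

The two values should agree to several decimal places, which confirms the derivation above for this choice of parameters.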

Given the relationship between $H(X)$ and $\sigma^2$, I was told that, as intuition suggests, when the variance decreases and $f(x)$ becomes a bell curve more tightly concentrated around its mean $\mu$, the uncertainty about the value of $X$ decreases, and so the entropy decreases as well.

However, I don't fully understand this behaviour at the boundaries: why does the entropy satisfy $H(X)=0$ when $\sigma^2=\frac{1}{2\pi e}$? And why, below this threshold, does the entropy grow in magnitude with negative sign even though the variance keeps decreasing towards $0$? If these are correct results, what is their meaning?
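To make the boundary concrete, a quick numerical sketch (again assuming natural logarithms, $a=e$; the helper name `gaussian_entropy` is just for illustration) evaluates the closed form at $\sigma^2=\frac{1}{2\pi e}$ and at a smaller variance:

```python
import numpy as np

def gaussian_entropy(var):
    """Closed-form differential entropy (1/2) * ln(2*pi*e*var) in nats."""
    return 0.5 * np.log(2 * np.pi * np.e * var)

# The variance at which the closed form crosses zero
critical = 1.0 / (2 * np.pi * np.e)

print(gaussian_entropy(critical))      # 0.0: the argument of the log is exactly 1
print(gaussian_entropy(critical / 4))  # negative: 0.5 * ln(1/4) = -ln(2)
```

This reproduces the observation in the question: at $\sigma^2=\frac{1}{2\pi e}$ the argument of the logarithm equals $1$, so $H(X)=0$, and for any smaller variance the closed form is negative.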