Does $\operatorname{var}[X]<\infty$ imply $h(X)< \infty$, $h$ the entropy?


Let $X$ be a real-valued random variable, with a density $p$ with respect to the Lebesgue measure. The differential entropy is defined as

$$ h(X) = -\int \ln p(x) p(x) dx$$

provided the integral on the right exists. The variance of $X$ is defined as $$\operatorname{var}[X] = \int (x-\mathbb{E}[X])^2 p(x)\, dx,$$ again provided the integral on the right exists.
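To make the two definitions concrete, here is a small numerical sketch (not part of the question; it assumes `numpy` and `scipy` are available, and the scale $\sigma = 1.3$ is an arbitrary illustrative choice) that evaluates both integrals for a centered Gaussian density and compares the entropy with the closed form $\tfrac12\ln(2\pi e\sigma^2)$:

```python
# Numerical evaluation of h(X) = -∫ p ln p dx and var[X] = ∫ x² p dx
# for a centered Gaussian density (illustrative example).
import numpy as np
from scipy.integrate import quad

sigma = 1.3  # arbitrary illustrative scale


def p(x):
    return np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)


def neg_plogp(x):
    # Guard against log(0) where the density underflows in the tails.
    px = p(x)
    return -px * np.log(px) if px > 0 else 0.0


def second_moment(x):
    # Guard first, so we never form inf * 0 in the far tails.
    px = p(x)
    return x * x * px if px > 0 else 0.0


h, _ = quad(neg_plogp, -np.inf, np.inf)
var, _ = quad(second_moment, -np.inf, np.inf)

# For a Gaussian, h(X) = 0.5 * ln(2 pi e sigma^2) and var[X] = sigma^2.
print(h, 0.5 * np.log(2 * np.pi * np.e * sigma**2), var)
```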

In some lecture notes I found online, it is stated without proof that $$ \operatorname{var}[X] < \infty \implies h(X) < \infty.$$

I could not find a reference, and I do not immediately see how this could work. What I could imagine is that they meant $p \in \mathcal{L}^2$, which by Jensen's inequality would give $-h(X) \leq \ln \int p^2(x)\, dx < \infty.$
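The Jensen step in the question can be checked numerically: $-h(X) = \mathbb{E}[\ln p(X)] \leq \ln \mathbb{E}[p(X)] = \ln \int p^2$. The sketch below (my own illustration, assuming `numpy`/`scipy`; the Laplace density and scale $b=0.7$ are arbitrary choices of an $\mathcal{L}^2$ density) verifies the inequality:

```python
# Numerical check of the Jensen-type bound -h(X) <= ln ∫ p² for an
# L² density, here a Laplace density (illustrative choice).
import numpy as np
from scipy.integrate import quad

b = 0.7  # arbitrary Laplace scale


def p(x):
    return np.exp(-abs(x) / b) / (2 * b)


def neg_plogp(x):
    # Guard against log(0) where the density underflows.
    px = p(x)
    return -px * np.log(px) if px > 0 else 0.0


# Split at 0 because the Laplace density has a kink there.
h = quad(neg_plogp, -np.inf, 0)[0] + quad(neg_plogp, 0, np.inf)[0]
l2 = quad(lambda x: p(x)**2, -np.inf, 0)[0] + \
     quad(lambda x: p(x)**2, 0, np.inf)[0]

# Jensen: -h(X) = E[ln p(X)] <= ln E[p(X)] = ln ∫ p².
print(-h, np.log(l2))  # the first number should not exceed the second
```

For the Laplace density the entropy is also available in closed form, $h = 1 + \ln(2b)$, which the test compares against.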

Is there a proof for the claimed property?

Accepted answer:

Suppose $X$ has finite variance $\sigma^2 = \operatorname{var}[X]$ and mean $\mu = \mathbb{E}[X]$, and let $\varphi$ be the density of $\mathcal{N}(\mu,\sigma^2)$. The relative entropy between $p$ and $\varphi$ is nonnegative (Gibbs' inequality), so

$$0 \leq \int p(x)\ln\frac{p(x)}{\varphi(x)}\,dx = -h(X) - \int p(x)\ln\varphi(x)\,dx.$$

Since $\ln\varphi(x) = -\tfrac12\ln(2\pi\sigma^2) - \frac{(x-\mu)^2}{2\sigma^2}$ and $\int (x-\mu)^2 p(x)\,dx = \sigma^2$, the second term evaluates to

$$-\int p(x)\ln\varphi(x)\,dx = \tfrac12\ln(2\pi\sigma^2) + \tfrac12 = \tfrac12\ln(2\pi e\sigma^2).$$

Hence

$$h(X) \leq \tfrac12\ln\bigl(2\pi e\operatorname{var}[X]\bigr) < \infty,$$

which is exactly the statement that the Gaussian maximizes differential entropy among all densities with a given variance.

(A tempting shortcut — "the integral $\int(x-\mathsf{E}X)^2p(x)\,dx$ converges, hence $(x-\mathsf{E}X)^2p(x)\to 0$ and $p(x)=o(1/x^2)$" — does not work: an integrable function need not vanish at infinity, since $p$ may have ever taller and narrower spikes along the tail.)

As you can see, finite variance is far from being a necessary condition. Repeating the argument with a Cauchy reference density $\varphi(x)=\frac{1}{\pi(1+x^2)}$ gives $h(X)\leq \ln\pi + \mathbb{E}\bigl[\ln(1+X^2)\bigr]$, so a logarithmic moment already suffices.
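As a sanity check of the claimed implication, one can compare $h(X)$ numerically against the Gaussian maximum-entropy bound $\tfrac12\ln(2\pi e\operatorname{var}[X])$ for a non-Gaussian density. The sketch below (my own illustration, assuming `numpy`/`scipy`; the exponential density and scale $\beta = 2$ are arbitrary choices) does this:

```python
# Numerical check that h(X) <= 0.5 * ln(2 pi e var[X]) for a
# non-Gaussian example, here an exponential density.
import numpy as np
from scipy.integrate import quad

beta = 2.0  # arbitrary exponential scale


def p(x):
    return np.exp(-x / beta) / beta if x >= 0 else 0.0


def neg_plogp(x):
    # Guard against log(0) where the density underflows in the tail.
    px = p(x)
    return -px * np.log(px) if px > 0 else 0.0


h, _ = quad(neg_plogp, 0, np.inf)
mean, _ = quad(lambda x: x * p(x), 0, np.inf)


def var_integrand(x):
    # Guard first, so we never form inf * 0 far in the tail.
    px = p(x)
    return (x - mean)**2 * px if px > 0 else 0.0


var, _ = quad(var_integrand, 0, np.inf)

bound = 0.5 * np.log(2 * np.pi * np.e * var)
print(h, bound)  # h should stay below the Gaussian max-entropy bound
```

For the exponential density the closed form is $h = 1 + \ln\beta$, which the test compares against.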