Entropy and Expectation


Suppose that the probability mass function of a random variable $X$ with values in $A = \{1, 2, \dots\}$ has nonincreasing probabilities, $p(k + 1) \leq p(k)$ for all $k \geq 1$. Show that if $H(X) < \infty$, then $\mathbb{E}[\log X] < \infty$.

I can intuitively see that this is true, since $-\log$ grows very quickly near the origin, much faster than linearly. So if $H(X) = -\sum_{x=1}^\infty p(x)\log p(x)$ is finite, then $p(x)$ must be shrinking quickly enough to compensate and keep the sum finite. Hence it also decreases quickly enough to keep $\mathbb{E}[\log X] = \sum_{x=1}^\infty p(x)\log x$ finite. I can't seem to figure out how to make this intuition rigorous.



Accepted answer:

I think you can apply a comparison test between the two series.

Since $p(n)$ is summable and monotonically nonincreasing, we have $p(n) \leq \frac{1}{n}$ for all $n$ (otherwise the first $n$ terms alone would sum to more than $1$), and this implies $-\log p(n) \geq \log n$. Hence $-p(n)\log p(n) \geq p(n)\log n$ for every $n$, and the finiteness of $H(X)$ gives the finiteness of $\mathbb{E}[\log X]$ by comparison.

Another answer:

This solution is based on the ideas of @Keen-ameteur.

Since the sequence $(p_k)$ is nonincreasing, we have $p_1 \geq \ldots \geq p_k$ for all $k.$ If for some $k$ we had $p_k > \dfrac{1}{k},$ then each of the first $k$ probabilities would exceed $\dfrac{1}{k},$ giving $p_1 + \ldots + p_k > k \cdot \dfrac{1}{k} = 1,$ which is impossible. Therefore $p_k \leq \dfrac{1}{k}$ for all $k.$

The result now follows easily: $p_k \leq \dfrac{1}{k}$ gives $-\log p_k \geq \log k,$ hence $-p_k \log p_k \geq p_k \log k$ for every $k.$ Summing over $k$ and using $H(X) < \infty$ yields $\mathbb{E}[\log X] \leq H(X) < \infty.$ Q.E.D.