Suppose that the probability mass function of a random variable $X$ with values in $A = \{1, 2, \dots\}$ has nonincreasing probabilities, $P(k + 1) \leq P(k)$ for all $k \geq 1$. Show that if $H(X) < \infty$, then $\mathbb{E}[\log X] < \infty$.
I can intuitively see that this is true, since $-\log$ grows very quickly near the origin, much faster than linearly. So if $H(X) = -\sum_{x=1}^\infty p(x)\log p(x)$ is finite, then $p(x)$ must shrink quickly enough to keep that sum finite, and hence quickly enough to keep $\mathbb{E}[\log X] = \sum_{x=1}^\infty p(x)\log x$ finite as well. I can't seem to figure out how to put this idea into rigorous math.
I think you can apply a comparison test between two series.
Since $p(n)$ is nonincreasing, $n\,p(n) \leq \sum_{k=1}^n p(k) \leq 1$, so $p(n) \leq \frac{1}{n}$ for every $n \geq 1$. This implies $-\log p(n) \geq \log n$, and hence $-p(n)\log p(n) \geq p(n)\log n$ termwise. Summing over $n$ gives
$$\mathbb{E}[\log X] = \sum_{n=1}^\infty p(n)\log n \leq \sum_{n=1}^\infty -p(n)\log p(n) = H(X) < \infty.$$
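As a sanity check (not part of the proof), here is a small numerical illustration of the comparison using a geometric distribution $p(n) = 2^{-n}$ on $\{1, 2, \dots\}$, which is nonincreasing; the truncation point `N` is an arbitrary choice:

```python
import math

# Hypothetical example distribution: geometric p(n) = (1/2)^n on {1, 2, ...},
# which is nonincreasing, so the monotonicity bound should hold at every n.
def p(n):
    return 0.5 ** n

N = 200  # truncation point; the tail beyond this is negligible here

for n in range(1, N + 1):
    # Monotonicity bound: n * p(n) <= sum_{k<=n} p(k) <= 1, i.e. p(n) <= 1/n.
    assert n * p(n) <= 1.0
    # Termwise comparison: p(n) * log n <= -p(n) * log p(n).
    assert p(n) * math.log(n) <= -p(n) * math.log(p(n)) + 1e-12

# Partial sums: E[log X] is dominated by H(X) (both in nats).
E_logX = sum(p(n) * math.log(n) for n in range(1, N + 1))
H = sum(-p(n) * math.log(p(n)) for n in range(1, N + 1))
print(E_logX, H)
assert E_logX <= H
```

For this distribution $H(X) = 2\ln 2 \approx 1.386$ nats, while the partial sum for $\mathbb{E}[\log X]$ comes out near $0.51$, consistent with the termwise domination.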