This question is about when a discrete distribution has finite entropy because it has finite moments.
Let $X$ be a discrete random variable (which can be positive, negative, and/or zero unless stated otherwise), and let $$H(X) = -\sum_n \mathbb{P}[X=n] \ln(\mathbb{P}[X=n])$$ be the Shannon entropy of $X$.
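For concreteness, this definition can be evaluated numerically; the sketch below (the geometric distribution is my own illustrative choice, not part of the question) computes the entropy in nats of a truncated pmf:

```python
import math

def entropy_nats(p):
    """Shannon entropy (in nats) of a pmf given as an iterable of
    probabilities; zero-probability terms contribute 0."""
    return -sum(q * math.log(q) for q in p if q > 0)

# Illustration: the geometric distribution with p_n = 2^(-n) on n = 1, 2, ...
# has entropy 2*ln(2) nats; truncating at 200 terms leaves a negligible tail.
geom = [0.5 ** (n + 1) for n in range(200)]
print(entropy_nats(geom))  # close to 2*math.log(2) ≈ 1.3863
```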
It is known that:
- If $X$ is integer-valued and has a finite variance, then $H(X)$ is finite (Massey 1988).
- If $X$ is integer-valued, is 0 or greater, and has a finite mean, then $H(X)$ is finite (Rioul 2022).
It is also known that a discrete distribution can have infinite Shannon entropy even if it is positive-valued (for example, certain members of the zeta Dirichlet family of distributions); see also Devroye and Gravel 2020.
However, I believe that this is so because the infinite-entropy zeta Dirichlet distributions have an infinite $z$th moment for every real $z>0$.
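To make the infinite-entropy phenomenon concrete, here is a sketch using a standard example of my own choosing (in the same spirit as the zeta Dirichlet examples): weights $w_n = 1/(n (\ln n)^2)$ for $n \ge 2$. The total mass converges, but the entropy series behaves like $\sum 1/(n \ln n)$, which diverges (very slowly, like $\ln \ln N$). The weights are left unnormalized for simplicity; normalizing shifts the entropy series only by a bounded amount, so the divergence is unaffected.

```python
import math

def partial_sums(N):
    """Partial sums of mass and of the (unnormalized) entropy series
    for w_n = 1/(n * (log n)^2), n = 2..N."""
    mass = ent = 0.0
    for n in range(2, N + 1):
        w = 1.0 / (n * math.log(n) ** 2)
        mass += w
        ent += w * math.log(1.0 / w)
    return mass, ent

# The mass column levels off, while the entropy column keeps growing
# (roughly like log log N, i.e. without bound, though very slowly).
for N in (10**3, 10**4, 10**5):
    print(N, [round(v, 4) for v in partial_sums(N)])
```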
Moreover, I believe the Cauchy distribution has a finite (differential) entropy even though it has no finite mean, because its absolute moment $\mathbb{E}[|X|^z]$ is finite for every $z$ in $(0, 1)$.
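This can be illustrated with a discrete analogue of the Cauchy distribution (my own choice, to stay in the discrete setting of the question): $p_n \propto 1/(1+n^2)$ over all integers $n$. Its mean of $|X|$ diverges, yet $\mathbb{E}[|X|^z]$ is finite for $z \in (0,1)$ and the entropy is finite, as truncated sums suggest:

```python
import math

def truncated_stats(N, z=0.5):
    """Truncate the weights w_n = 1/(1+n^2) to |n| <= N, normalize, and
    return (E|X|^z, entropy, E|X|) for the truncated distribution."""
    Z = sum(1.0 / (1 + n * n) for n in range(-N, N + 1))
    mom_z = ent = mean_abs = 0.0
    for n in range(-N, N + 1):
        q = (1.0 / (1 + n * n)) / Z
        mom_z += abs(n) ** z * q
        ent -= q * math.log(q)
        mean_abs += abs(n) * q
    return mom_z, ent, mean_abs

# The z = 0.5 moment and the entropy stabilize as N grows, while the
# truncated mean of |X| keeps growing like log N.
for N in (10**3, 10**5):
    print(N, [round(v, 3) for v in truncated_stats(N)])
```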
However, several questions remain on the relationship between finite entropy and finite moments.
Questions
- If $X$ is supported on the whole set of integers and is such that $\mathbb{E}[X^2]$ is infinite but $\mathbb{E}[|X|^z]$ is finite for some $z$ in $[1, 2)$, then is $H(X)$ finite? If not, under what additional conditions is $H(X)$ finite?
- If $X$ is supported on the integers or some subset thereof, such that $\mathbb{E}[|X|]$ is infinite but $\mathbb{E}[|X|^z]$ is finite for every $z$ in $(0, 1)$, then is $H(X)$ finite? If not, under what additional conditions is $H(X)$ finite?
- If $X$ is supported on the integers or some subset thereof, such that $\mathbb{E}[|X|]$ is infinite but $\mathbb{E}[|X|^z]$ is finite for some $z$ in $(0, 1)$, then is $H(X)$ finite? If not, under what additional conditions is $H(X)$ finite?
- If $X$ is supported on the integers or some subset thereof, such that $\mathbb{E}[|X|^z]$ is infinite for every real $z > 0$, then is $H(X)$ infinite?
Motivation
My motivation is to characterize the discrete distributions with finite entropy, since only finite-entropy distributions can be sampled in finite time on average (Knuth and Yao 1976).
References
Rioul, O., "Variations on a Theme by Massey", IEEE Transactions on Information Theory, 2022, doi: 10.1109/TIT.2022.3141264.
Massey, J. L., "On the entropy of integer-valued random variables", 1988.
Devroye, L. and Gravel, C., "Random variate generation using only finitely many unbiased, independently and identically distributed random bits", arXiv:1502.02539v6 [cs.IT], 2020.
Knuth, D. E. and Yao, A. C.-C., "The complexity of nonuniform random number generation", in Algorithms and Complexity: New Directions and Recent Results, 1976.
I assume that $X$ takes non-negative integral values.
First, suppose that $E X^a<\infty$ for some $a>0$. For each $n\ge 1$ there are two cases. If $p_n\ge e^{-n^a}$, then $\log(1/p_n)\le n^a$, so $$ p_n\log \frac{1}{p_n}\le n^a p_n. $$ Otherwise $p_n< e^{-n^a}$, and since $x\log(1/x)\le \frac{2}{e}\sqrt{x}$ for all $x\in(0,1]$, $$ p_n\log \frac{1}{p_n}\le \frac{2}{e}\sqrt{p_n}\le \frac{2}{e}e^{-n^a/2}. $$ Both upper bounds are summable, the first because $\sum_n n^a p_n = E X^a<\infty$, and the $n=0$ term contributes at most $1/e$. So $H(X)<\infty$.
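As a numerical sanity check of this two-case bound, the sketch below uses an illustrative pmf of my own choosing, $p_n \propto 1/n^2$, whose $a$-th moments are finite for every $a < 1$; here $a = 1/2$. It verifies the bound term by term (using $x\log(1/x) \le \tfrac{2}{e}\sqrt{x}$ in the second case) and accumulates both series:

```python
import math

a = 0.5
N = 10**5
Z = sum(n ** -2.0 for n in range(1, N + 1))   # truncated normalizer ~ zeta(2)
total_entropy = total_bound = 0.0
for n in range(1, N + 1):
    p = (n ** -2.0) / Z
    term = p * math.log(1.0 / p)
    if p >= math.exp(-n ** a):
        bound = n ** a * p                              # case 1: log(1/p_n) <= n^a
    else:
        bound = (2 / math.e) * math.exp(-n ** a / 2)    # case 2: via x log(1/x) <= (2/e) sqrt(x)
    assert term <= bound + 1e-12                        # the bound holds term by term
    total_entropy += term
    total_bound += bound
# Both partial sums stay bounded, consistent with H(X) being finite.
print(round(total_entropy, 4), round(total_bound, 4))
```

The second-case bound is crude but summable, which is all the argument needs.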
Conversely, if we know that $E X^a=\infty$ for all $a>0$, there is not much we can say about $H(X)$. Consider for example $p_n=c/m^2$ if $n=\lceil e^m\rceil$ for some integer $m\ge 1$ (with $c=6/\pi^2$ so that the probabilities sum to $1$) and $p_n=0$ otherwise. This distribution satisfies the all-infinite-moments condition, since $E X^a\ge c\sum_{m\ge 1} e^{am}/m^2=\infty$ for every $a>0$ (the terms of the sum do not even converge to $0$), but $$ \sum_{n\le e^m} p_n\log(1/p_n)=\sum_{k=1}^{m} \frac{c}{k^2}\log\frac{k^2}{c}, $$ which converges as $m\to\infty$. So $H(X)<\infty$.
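This counterexample can be made concrete in a short sketch (my own normalization: $c = 6/\pi^2$, with the support points $n = \lceil e^m \rceil$ for $m \ge 1$ so the masses sum to $1$):

```python
import math

c = 6 / math.pi ** 2
a = 0.1                          # any a > 0 makes the a-th moment infinite
entropy = 0.0
for m in range(1, 201):
    p = c / m ** 2
    entropy += p * math.log(1.0 / p)   # entropy partial sum: converges
    # The a-th moment terms p * n**a with n = ceil(e^m) satisfy
    # p * n**a >= c * exp(a*m) / m**2, which tends to infinity with m;
    # they are tracked in log space because n itself is astronomically large.
    moment_term = c * math.exp(a * m) / m ** 2
# Finite entropy, yet the moment terms already exceed 10^3 by m = 200.
print(round(entropy, 4), round(moment_term, 1))
```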