Does the Shannon Entropy always exist (even for infinite distributions)?


Let $p : \mathbb{N} \to [0, 1]$ be a probability distribution over the naturals.

The Shannon Entropy is:

$$H = -\sum_{n=0}^\infty p(n)\log_2 p(n)$$

Does this series always converge?


I tried to attack this problem, but it's been some time since I last evaluated series. My attempt:

We know that $\sum_{n=0}^\infty p(n) = 1$, so it must be the case that $\lim\limits_{n\to\infty} p(n) = 0$, and since $\lim\limits_{x\to 0^+} x\log_2 x = 0$, for any $\varepsilon > 0$ there are only finitely many terms greater than $\varepsilon$. The problem is that I can't conclude anything from this, because there are many series whose terms go to zero but which still diverge.
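To see that the series at least *can* converge, here is a quick numerical sketch (the cutoff of 200 terms is my choice) using the geometric distribution $p(n) = 2^{-(n+1)}$, for which each term is $(n+1)\,2^{-(n+1)}$ and the entropy is exactly 2 bits:

```python
import math

# Geometric distribution p(n) = 2^-(n+1) on n = 0, 1, 2, ...
# Each entropy term is -p(n) log2 p(n) = (n+1) * 2^-(n+1),
# and the full series sums to exactly 2 bits.
H = sum((n + 1) * 2.0 ** -(n + 1) for n in range(200))
print(H)  # converges to 2.0 up to floating point
```

So the question is really whether convergence can *fail* for some distribution.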


There are 2 best solutions below


This series does not always converge. First note that the following series converges: $$\sum\limits_{i = 1}^\infty \lceil 2^i/i^2\rceil \cdot 2^{-i}.$$ This is because $\lceil 2^i/i^2\rceil \le 2^i/i^2 + 1$, so the $i$-th term is at most $1/i^2 + 2^{-i}$ and the series converges by comparison. Letting $S > 0$ denote its sum, the constant $c = 1/S > 0$ satisfies $$\sum\limits_{i = 1}^\infty \lceil 2^i/i^2\rceil \cdot c2^{-i} = 1.$$
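A numerical sanity check of this step (the cutoffs are arbitrary choices of mine): the partial sums of $\sum_i \lceil 2^i/i^2\rceil\, 2^{-i}$ stabilize, and the normalizing constant $c = 1/S$ lies in $(0, 1)$.

```python
# Partial sums of S = sum_{i>=1} ceil(2^i / i^2) * 2^{-i}.
# The expression -(-a // b) is an exact integer ceiling of a/b.
def partial_sum(N):
    return sum(-(-(2 ** i) // i ** 2) * 2.0 ** -i for i in range(1, N + 1))

S = partial_sum(400)
c = 1.0 / S  # normalizing constant from the answer
```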

So, consider a probability distribution over $\mathbb{N}$ in which, for every $i\ge 1$, there are exactly $\lceil 2^i/i^2\rceil$ outcomes whose probability equals $c2^{-i}$. The Shannon entropy diverges for such a distribution: \begin{align*} H = \sum\limits_{i\ge 1} \lceil 2^i/i^2 \rceil \cdot c 2^{-i} \log_2(2^i/c) &\ge \sum\limits_{i\ge \max\{1,\, 2\log_2(c)\}} \lceil 2^i/i^2 \rceil \cdot c 2^{-i} \log_2(2^i/c) \\ &\ge \sum\limits_{i\ge \max\{1,\, 2\log_2(c)\}} 2^i/i^2 \cdot c 2^{-i} \log_2(2^i/2^{i/2})\\ &= \sum\limits_{i\ge \max\{1,\, 2\log_2(c)\}} \frac{c}{2i}, \end{align*} and this last series diverges by comparison with the harmonic series.
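The divergence can be illustrated numerically (a sketch; the cutoffs 400/200/100 are my choices): block $i$ contributes $\lceil 2^i/i^2\rceil \cdot c2^{-i}\log_2(2^i/c) \approx c/i$ bits, so the partial entropy keeps growing roughly like $c\ln N$.

```python
import math

# Normalizing constant c = 1/S, with -(-a // b) an exact integer ceiling.
S = sum(-(-(2 ** i) // i ** 2) * 2.0 ** -i for i in range(1, 401))
c = 1.0 / S

def H_partial(N):
    # Each of the ceil(2^i/i^2) outcomes at level i has probability c*2^{-i}
    # and contributes c*2^{-i} * log2(2^i/c) = c*2^{-i} * (i - log2 c) bits.
    return sum(-(-(2 ** i) // i ** 2) * c * 2.0 ** -i * (i - math.log2(c))
               for i in range(1, N + 1))
```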


The answer is no. Define $p$ by

$$ p(n) = \frac{c}{(n+2)\log^2(n+2)}, $$

where $c$ is the normalizing constant, which exists because $\sum_n \frac{1}{(n+2)\log^2(n+2)}$ converges (e.g. by the integral test). Then $-p(n)\log p(n) \sim \frac{c}{n\log n}$ as $n\to\infty$, and since $\sum \frac{1}{n\log n}$ diverges, $H(p) = \infty$ by the limit comparison test.
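A truncated numerical sketch of this answer (the cutoffs are my choices, and the finite-sum normalizer slightly overshoots the true $c$): the probabilities sum to a finite value, while the entropy partial sums keep growing, albeit only like $\log\log N$.

```python
import math

# p(n) = c / ((n+2) * log^2(n+2)), with c estimated from a truncated sum.
Z = sum(1.0 / ((n + 2) * math.log(n + 2) ** 2) for n in range(200000))
c = 1.0 / Z

def H_partial(N):
    # Partial entropy sum; each term -p log2 p ~ c' / (n log n), a
    # divergent series, so this grows without bound (very slowly).
    total = 0.0
    for n in range(N):
        p = c / ((n + 2) * math.log(n + 2) ** 2)
        total -= p * math.log2(p)
    return total
```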