Question regarding the Entropy of a probability mass function


I assume that the entropy, $H$, of a probability mass function (pmf), $p(X)$, of a discrete random variable, $X$, is computed as:

$$\begin{align}H(X) &= -p(X = x_1)\log[p(X = x_1)]-p(X = x_2)\log[p(X = x_2)]-p(X = x_3)\log[p(X = x_3)] \\ &= - \sum_{i=1}^3 p(X=x_i)\log[p(X=x_i)] \end{align}$$

(I assume that $X$ takes values in the set $\{x_1,x_2,x_3\}$).

Suppose I have two candidate pmfs of $X$, denoted $p_1(X)=[0.5,0.2,0.3]$ and $p_2(X)=[0.2,0.3,0.5]$. Clearly, both pmfs have the same entropy, since one is a permutation of the other and entropy is symmetric in its arguments.
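As a quick sanity check of the definition and the permutation invariance, here is a minimal Python sketch (standard library only; the function name `entropy` and the choice of base 2 are my own):

```python
import math

def entropy(pmf, base=2):
    """Shannon entropy -sum p*log(p), with the convention 0*log(0) = 0."""
    return -sum(p * math.log(p, base) for p in pmf if p > 0)

# The two candidate pmfs from the question: one is a permutation of the
# other, so their entropies agree (up to floating-point rounding).
p1 = [0.5, 0.2, 0.3]
p2 = [0.2, 0.3, 0.5]
print(entropy(p1))  # ≈ 1.48548 bits
print(abs(entropy(p1) - entropy(p2)) < 1e-12)  # True
```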

My question: Does there exist a pmf, say $p_3(X)$, whose constituent probabilities are *not* a permutation of those of $p_1(X)$, but which has the same entropy as $p_1(X)$?


BEST ANSWER

Such coincidences do exist: for any alphabet with more than two values, every entropy level other than the maximum (attained only by the all-values-equal pmf) is shared by infinitely many distinct pmfs.

For your example, consider (although there is an entire 1-parameter family of iso-entropic distributions) the distribution with $$ p(X) = [0.24301892,0.24301892,0.51396216] $$ This has the same entropy, to 8 decimal places, as $[0.2,0.3,0.5]$. You can prove that these "coincidences" do occur (exactly) using the intermediate value theorem.
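The intermediate-value argument can be made concrete with bisection. A minimal stdlib-Python sketch (my own verification, not the answerer's code) searches the symmetric family $(t, t, 1-2t)$, along which the entropy rises continuously and monotonically from $0$ to $\log 3$:

```python
import math

def H(pmf):
    """Shannon entropy in nats."""
    return -sum(p * math.log(p) for p in pmf if p > 0)

target = H([0.2, 0.3, 0.5])

# Along the family (t, t, 1 - 2t), H grows continuously from 0 (as t -> 0)
# to log(3) (at t = 1/3), so by the intermediate value theorem some t hits
# `target` exactly; monotonicity lets bisection locate it.
lo, hi = 1e-9, 1 / 3
for _ in range(60):
    mid = (lo + hi) / 2
    if H([mid, mid, 1 - 2 * mid]) < target:
        lo = mid
    else:
        hi = mid

t = (lo + hi) / 2
print([t, t, 1 - 2 * t])  # ≈ [0.2430189, 0.2430189, 0.5139621]
```

Running this recovers the distribution quoted above to the printed precision.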

ANOTHER ANSWER

You are simply looking for the solutions of the implicit equation

$$x \log (x)+y\log (y)+(1-x-y)\log (1-x-y) = 0.2\log (0.2)+0.3\log (0.3)+ 0.5 \log (0.5),$$

subject to $x, y \ge 0$ and $x + y \le 1$.

For example, you can plot the solution set: it traces a closed curve inside the probability simplex.
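Points on this curve can also be found numerically by fixing $x$ and solving for $y$; on $0 < y < (1-x)/2$ the left-hand side is strictly increasing in $y$, so bisection applies. A stdlib-Python sketch (the helper name `y_for` and the sample $x$ values are my own choices):

```python
import math

def H(pmf):
    """Shannon entropy in nats."""
    return -sum(p * math.log(p) for p in pmf if p > 0)

target = H([0.2, 0.3, 0.5])  # right-hand side of the implicit equation

def y_for(x, eps=1e-12):
    """For fixed x, find y in (0, (1-x)/2) with H([x, y, 1-x-y]) == target.
    H is increasing in y on that interval, so bisection applies.
    Returns None when even the max-entropy choice y = (1-x)/2 falls short."""
    lo, hi = eps, (1 - x) / 2
    if H([x, hi, 1 - x - hi]) < target:
        return None
    for _ in range(100):
        mid = (lo + hi) / 2
        if H([x, mid, 1 - x - mid]) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# A few points on the iso-entropy curve:
for x in (0.25, 0.4, 0.45):
    y = y_for(x)
    print(x, y, H([x, y, 1 - x - y]))  # each line: x, y, entropy == target
```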
