Minimum entropy of a discrete random variable - find the appropriate distributions.


I am asked to determine what the minimum entropy of a discrete random variable might be.

I have a hunch that the result will be zero, given that $$H(X) = \sum_i p(x_i)\log\frac{1}{p(x_i)}$$

Let's say that our random variable takes on $n$ values. We can zero out $n-1$ terms of the sum by setting their probabilities to zero, using the standard convention $0 \cdot \log\frac{1}{0} := 0$ (justified by the limit $p\log\frac{1}{p} \to 0$ as $p \to 0^+$). Since the probabilities must add up to $1$, the remaining event has probability $1$, and it contributes $1 \cdot \log(1) = 0$ to the overall entropy.

So the minimum entropy is $0$, and there are exactly $n$ distributions that achieve it: choose one element and give it probability $1$ ($n$ ways to do that), and set all the others to zero.
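As a quick numerical sanity check, here is a minimal Python sketch of this argument; the function name `entropy` and the choice of base-2 logarithms are my own, not from the question. It skips zero-probability terms, matching the convention $0 \cdot \log\frac{1}{0} := 0$ above.

```python
import math

def entropy(p):
    """Shannon entropy in bits, using the convention 0 * log(1/0) = 0."""
    return sum(pi * math.log2(1 / pi) for pi in p if pi > 0)

# A degenerate distribution: one outcome has probability 1, the rest are 0.
print(entropy([1.0, 0.0, 0.0, 0.0]))  # 0.0 (the claimed minimum)

# For comparison, the uniform distribution on the same 4 outcomes:
print(entropy([0.25] * 4))            # 2.0 bits (the maximum for n = 4)
```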

Is this solution correct?

2 Answers


Yes. If a variable takes one of the values $\{0, 1, 2, \ldots, n\}$ but always evaluates to the same one, e.g., $2$, then there is no uncertainty about the outcome, hence the entropy is $0$.


I'd say yes. From the point of view of information theory, the entropy tells you how many bits you need on average to encode the outcome of an event, where each outcome has a different probability. So if there is one outcome that always occurs (with probability $1$), then you need zero bits, because there is no need to send a message: the outcome is predetermined.
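For contrast, a quick worked example (my own, for illustration): a fair coin with $p = (\frac{1}{2}, \frac{1}{2})$ has $$H(X) = \tfrac{1}{2}\log_2 2 + \tfrac{1}{2}\log_2 2 = 1 \text{ bit},$$ so one bit per outcome is needed on average, while the degenerate distribution $p = (1, 0)$ gives $H(X) = 1 \cdot \log_2 1 = 0$ bits, matching the intuition that a certain outcome needs no message at all.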