Entropy reduction


Maybe this is an elementary question in information theory.

Given N random integers $x_1, x_2, \dots , x_N$ between 1 and m, what's the reduction in entropy if all integers are distinct?


Assuming the integers are independent and uniformly distributed on $\{1, \dots, m\}$, there are $m^N$ equiprobable outcomes, so the joint entropy is $H_0 = N \log m$.

If we restrict to distinct integers, there are $m!/(m-N)!$ equiprobable outcomes, so the restricted entropy is $H_1 = \log m! - \log(m-N)!$.
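As a sanity check on that count, a short brute-force enumeration (with illustrative values $m = 5$, $N = 3$, not taken from the question) confirms that the number of ordered $N$-tuples of distinct integers in $\{1, \dots, m\}$ equals the falling factorial $m!/(m-N)!$:

```python
import math
from itertools import product

m, N = 5, 3  # small illustrative values (assumption, not from the question)

# Count ordered N-tuples from {1, ..., m} whose entries are all distinct.
distinct_tuples = sum(
    1 for t in product(range(1, m + 1), repeat=N) if len(set(t)) == N
)

# Falling factorial m! / (m - N)!
expected = math.factorial(m) // math.factorial(m - N)

print(distinct_tuples, expected)  # both equal 60 for m = 5, N = 3
```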

The reduction in entropy is

$$ \Delta = H_0 - H_1 = N \log m - \log m! + \log(m-N)!$$

For $m \gg N$, writing $\Delta = -\sum_{k=1}^{N-1} \log(1 - k/m)$ and using the expansion $-\log(1-x) \approx x$ (natural logarithm, hence entropy in nats), the first-order approximation is

$$ \Delta \approx \frac{N(N-1)}{2m}=\frac{\binom{N}{2}}{m} $$
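A quick numerical check (illustrative values $m = 10^6$, $N = 10^3$, chosen so that $m \gg N$) compares the exact reduction against the first-order approximation; `math.lgamma` evaluates $\log k!$ as $\log \Gamma(k+1)$ without overflowing:

```python
import math

m, N = 10**6, 10**3  # illustrative values with m >> N (assumption)

# Exact reduction: Delta = N log m - log m! + log (m - N)!, in nats.
delta_exact = N * math.log(m) - math.lgamma(m + 1) + math.lgamma(m - N + 1)

# First-order approximation: Delta ~ N(N-1) / (2m) = C(N, 2) / m.
delta_approx = N * (N - 1) / (2 * m)

print(delta_exact, delta_approx)  # the two agree to within a fraction of a percent
```

For these values the approximation is accurate to roughly one part in a thousand, consistent with the next-order correction being of size $O(N^3/m^2)$.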