Entropy of discrete and continuous uniform distributions


Despite a similar post here, I read that the entropy of a uniformly distributed discrete random variable is always log base $2$ of the number of equally likely outcomes, $H(X) = \log(N)$. Is this also true for a uniformly distributed continuous r.v.?


Think about it.

A real number between $0$ and $1$ can carry an infinite amount of information. Just encode each symbol of an alphabet as, for example, a 3-digit ASCII code, and concatenate those codes into the decimal expansion.

The differential entropy of a continuous distribution measures, roughly, the information tied to the shape of the distribution, not to an exact value. For a uniform r.v. on $(a, b)$ it is $h(X) = \log(b - a)$, which can be zero or even negative, so the discrete formula $H(X) = \log(N)$ does not carry over. To describe an actual sample you must add as many extra bits as whatever precision you want.
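A minimal sketch in Python of this relationship: the discrete uniform over $N$ outcomes has entropy exactly $\log_2 N$ bits, and quantizing a continuous Uniform$(0,1)$ into $2^n$ equal bins gives entropy $n$ bits, i.e. the differential entropy ($\log_2 1 = 0$) plus $n$ bits of precision. The helper function name here is my own choice, not from any particular library.

```python
import math

def discrete_entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Discrete uniform over N outcomes: H(X) = log2(N).
N = 8
print(discrete_entropy([1 / N] * N))  # 3.0 bits

# Quantize Uniform(0, 1) into 2**n equal bins: each bin has mass 2**-n,
# so the entropy of the quantized variable is exactly n bits.
# This matches h(X) + n, since the differential entropy h(X) = log2(1) = 0.
for n in (4, 8, 16):
    bins = 2 ** n
    print(n, discrete_entropy([1 / bins] * bins))
```

Every extra bit of precision adds one bit of entropy, which is why no single finite number plays the role that $\log(N)$ does in the discrete case.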