Is there any differentiable function $f$ that approximates the "entropy" of a set of numbers $S$?


Here "entropy" means some measure of the degree of randomness/disorder in a given set of numbers $S = \{a_1, a_2, \ldots, a_n\}$.

For example, the set $S_{high} = \{4,0,2,5,8,3,7,2,5\}$ has a high degree of randomness/disorder.

And the set $S_{low} = \{4,4,4,4,5,5,5,5,5\}$ has a low degree of randomness/disorder.

I am aware of information entropy $IE$, which applies to probability distributions (and quantifies the amount of information, which is related to randomness/disorder, contained in a probability distribution):

$$IE = \sum_i p_i \log\frac{1}{p_i}$$

However, I simply have numbers. Granted, I can take these numbers and convert them to an empirical probability distribution:

$$S_{low} = \{4,4,4,4,5,5,5,5,5\} \;\longrightarrow\; p(4) = \tfrac{4}{9},\quad p(5) = \tfrac{5}{9}$$

But the process by which one would do so (counting how often each value occurs, so that the $IE$ formula above can be applied) is piecewise constant in the values: perturbing a value slightly does not change the counts at all until it crosses into another bin, so the map from raw numbers to the empirical distribution has zero derivative almost everywhere and no derivative at the jumps.
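To make the obstruction concrete, here is a short numpy sketch (my own illustration, not part of the question) of the non-differentiable route: count occurrences, normalize, and apply the $IE$ formula.

```python
import numpy as np

def hard_entropy(values):
    """Shannon entropy of the empirical distribution of `values`.

    np.unique's counting step is a piecewise-constant (hence
    non-differentiable) function of the inputs: perturbing 4.0 to
    4.001 changes the counts discontinuously.
    """
    _, counts = np.unique(np.asarray(values), return_counts=True)
    p = counts / counts.sum()            # empirical distribution
    return -np.sum(p * np.log(p))        # IE = sum p_i log(1/p_i)

S_low = [4, 4, 4, 4, 5, 5, 5, 5, 5]
print(hard_entropy(S_low))  # entropy of {4/9, 5/9} ~ 0.687 nats
```

The entropy value itself is fine; the problem is only that gradients cannot flow back through the counting step.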

So I wonder, is there any differentiable function that can take a set of raw numbers, and approximate the "entropy" of those numbers in the sense described above?
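One differentiable surrogate, sketched below under my own assumptions (the bin centers and the bandwidth `sigma` are free choices, not an established formula), replaces the hard counting with a Gaussian soft histogram: every value contributes a smooth weight to every bin, so the smoothed distribution, and hence the entropy, is a smooth function of the inputs.

```python
import numpy as np

def soft_entropy(values, bins, sigma=0.5):
    """Differentiable entropy approximation via a Gaussian soft histogram.

    Each value is assigned to every bin with a normalized Gaussian
    weight, so the empirical distribution varies smoothly as the
    values move, unlike a hard histogram.
    """
    values = np.asarray(values, dtype=float)
    bins = np.asarray(bins, dtype=float)
    # Soft assignment: weight of value i in bin j (each row sums to 1).
    w = np.exp(-((values[:, None] - bins[None, :]) ** 2) / (2 * sigma**2))
    w = w / w.sum(axis=1, keepdims=True)
    p = w.mean(axis=0)                   # smoothed empirical distribution
    eps = 1e-12                          # avoid log(0)
    return -np.sum(p * np.log(p + eps))

S_high = [4, 0, 2, 5, 8, 3, 7, 2, 5]
S_low = [4, 4, 4, 4, 5, 5, 5, 5, 5]
bins = np.arange(0, 9)                   # bin centers covering the data range
print(soft_entropy(S_high, bins))        # larger: disordered set
print(soft_entropy(S_low, bins))         # smaller: concentrated set
```

As `sigma` shrinks the soft histogram approaches the hard one, trading smoothness of the gradient for fidelity to the exact empirical entropy.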

1 Answer

You don't have sets, but strings $S$ of digits. If these digits are not just symbols but represent numerical values, you could take a discrete Fourier transform of $S$ concatenated with its reverse, in order to remove unwanted boundary effects. Chaotic behavior of $S$ will be reflected in the "middle" Fourier coefficients being large. This is the discrete analogue of the fact that the Fourier coefficients of a periodic analog function $f$ tend to zero at a speed depending on the smoothness of $f$.
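A minimal numpy sketch of this suggestion (the band boundaries and the energy-fraction summary are my own choices, not specified in the answer):

```python
import numpy as np

def middle_spectrum_energy(S):
    """Fraction of (non-DC) spectral energy in the "middle" Fourier
    coefficients of S concatenated with its reverse.

    Mirroring makes the periodic extension continuous at the boundary,
    so large middle (high-frequency) coefficients reflect genuine
    roughness of S rather than a boundary jump.
    """
    x = np.asarray(S, dtype=float)
    x = np.concatenate([x, x[::-1]])      # concatenate with the reverse
    power = np.abs(np.fft.fft(x)) ** 2    # power spectrum
    n = len(power)
    mid = power[n // 4 : 3 * n // 4]      # "middle" = high-frequency band
    return mid.sum() / power[1:].sum()    # ignore the DC component

S_high = [4, 0, 2, 5, 8, 3, 7, 2, 5]
S_low = [4, 4, 4, 4, 5, 5, 5, 5, 5]
print(middle_spectrum_energy(S_high))    # larger: chaotic string
print(middle_spectrum_energy(S_low))     # smaller: smooth string
```

Since the DFT is linear, the power spectrum, and any smooth summary of it such as this energy fraction, is a differentiable function of the input values.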