Entropy of a floating-point number array


I am familiar with Shannon's definition of entropy. $$ H(P) = - \sum_{i=1}^n p_i \cdot \log_2(p_i) $$
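In code, the discrete formula above amounts to something like the following (a NumPy sketch; the function name is mine):

```python
import numpy as np

def discrete_entropy(p):
    """Shannon entropy in bits of a discrete distribution p (probabilities summing to 1)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * log2(0) = 0, so zero-probability terms are dropped
    return -np.sum(p * np.log2(p))

print(discrete_entropy([0.5, 0.25, 0.25]))  # -> 1.5 bits
```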

I am now in a situation where I'd like to compute an entropy-like function for a set of (value, probability) pairs where the value is a floating-point number.

The objects I'm considering are point-by-point ratios between two discretized density fields, which are basically functions $f: [a,b] \to \mathbb R^+$.

The fact is there are a lot of floating-point values ... a huge lot ... and computing a probability for each one just doesn't sound right.

Is there any standard method to compute the entropy/energy of real-valued fields?

1 Answer


The first part of your question is a little confusing. $p_i$ is a probability, so it is restricted to $0 \leq p_i \leq 1$; it cannot be an integer other than 0 or 1. The probabilities it represents are discrete.

Regarding the other part of your question, the entropy is completely independent of the values that the function takes (the "alphabet" in information theory terminology).

"The fact is there are a lot of floating-point values ... a huge lot ... and computing a probability for each one just doesn't sound right."

I don't quite understand what you mean here, but if there is a finite number of values, then you can use the discrete entropy. However, if the function is continuous (takes an uncountably infinite number of values), then you need to invoke a continuous version of the entropy function, i.e. the differential entropy $h(f) = -\int f(x)\,\log_2 f(x)\,\mathrm{d}x$.
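In practice, one common way to get a finite number of values from real-valued samples is to bin them and apply the discrete entropy to the bin frequencies (a NumPy sketch; the function name, the bin count, and the Gaussian test data are my own choices, and the resulting estimate depends on the binning):

```python
import numpy as np

def binned_entropy(values, bins=64):
    """Estimate the entropy (in bits) of real-valued samples via a histogram.

    The bin count is a free parameter: finer bins give a larger estimate,
    so this is an entropy of the chosen discretization, not of the field itself.
    """
    counts, _ = np.histogram(values, bins=bins)
    p = counts / counts.sum()   # empirical probability of each bin
    p = p[p > 0]                # 0 * log2(0) = 0 by convention
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
samples = rng.normal(size=10_000)
print(binned_entropy(samples))
```

Note that with 64 bins the estimate is bounded above by $\log_2 64 = 6$ bits, attained only for a uniform occupancy of the bins.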