Relation between formulas for entropy


In Ramon Van Handel's notes for high dimensional probability http://www.princeton.edu/~rvan/APC550.pdf he introduces the notion of entropy of a nonnegative random variable $Z$ as

$$Ent[Z] = E[Z \log Z] - E[Z] \log E [Z].$$

My question is, what is the relation of this quantity to Shannon's differential entropy (if any)?

I don't see any relation (and it seems strange to me that someone would overload the word "entropy" without motivation, but perhaps I'm missing something).

For one thing, this "entropy" is not translation invariant (though this was to be expected, given that it is defined for a nonnegative random variable).

Further, the scaling property of the Shannon differential entropy, $h(aX)=h(X)+\log a$, does not carry over: instead $Ent(aZ) = a\, Ent(Z)$, since for $a>0$
$$E[aZ\log(aZ)] - E[aZ]\log E[aZ] = a\,E[Z\log Z] + a\log a\, E[Z] - a E[Z]\big(\log a + \log E[Z]\big) = a\,Ent(Z).$$
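
The scaling identity $Ent(aZ) = a\,Ent(Z)$ is easy to check numerically. Here is a minimal sketch using NumPy for a discrete nonnegative random variable (the values and probabilities below are arbitrary illustrative choices):

```python
import numpy as np

def ent(z, p):
    """Ent[Z] = E[Z log Z] - E[Z] log E[Z] for a discrete random
    variable taking positive values z with probabilities p."""
    ez = np.dot(p, z)
    return np.dot(p, z * np.log(z)) - ez * np.log(ez)

z = np.array([1.0, 2.0, 5.0])   # arbitrary positive values
p = np.array([0.2, 0.5, 0.3])   # probabilities summing to 1
a = 3.0

# Ent(aZ) equals a * Ent(Z), unlike h(aX) = h(X) + log a
print(np.isclose(ent(a * z, p), a * ent(z, p)))  # True
```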