Log base change problem, Multivariate Gaussian differential entropy proof


I am working through a proof in this document

http://ee.tamu.edu/~georghiades/courses/ftp647/Chapter7.pdf

for Theorem 3 (The entropy of a multivariate Gaussian distribution):

Let X = (X1, X2, · · · , Xn) be jointly Gaussian distributed with mean µ and covariance matrix K (i.e., X ∼ N(µ, K)). Then,

$$ h(X) = \frac{1}{2}\log\left[(2\pi e)^n |K|\right] $$
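As a numerical sanity check (not part of the linked document), the closed form with the natural logarithm can be compared against a Monte Carlo estimate of $-E[\ln f(X)]$; the mean µ and covariance K below are arbitrary illustrative values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not from the document)
mu = np.array([0.0, 1.0])
K = np.array([[2.0, 0.3],
              [0.3, 1.0]])
n = len(mu)

# Closed-form differential entropy in nats: (1/2) ln((2*pi*e)^n |K|)
h_closed = 0.5 * np.log((2 * np.pi * np.e) ** n * np.linalg.det(K))

# Monte Carlo estimate of -E[ln f(X)]
X = rng.multivariate_normal(mu, K, size=200_000)
d = X - mu
Kinv = np.linalg.inv(K)
quad = np.einsum('ij,jk,ik->i', d, Kinv, d)          # (x-mu)^T K^{-1} (x-mu)
log_f = -0.5 * quad - 0.5 * (n * np.log(2 * np.pi)
                             + np.log(np.linalg.det(K)))
h_mc = -log_f.mean()

print(h_closed, h_mc)  # the two estimates should agree closely
```

With 200,000 samples the Monte Carlo estimate typically matches the closed form to about two decimal places.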

The only doubt I have with the proof is the beginning:

$$ h(X) = -E[\ln f(X)] $$

In the beginning of the document (Definition 1):

The differential entropy of a continuous random variable X, denoted by h(X), is defined as

$$ h(X) = -\int_S f(x) \log f(x)\, dx = E[-\log f(X)] $$

I understand that by using the property of logs:

$$ \ln(x) = \frac{\log(x)}{\log(e)} $$

and therefore

$$ \log(x) = \ln(x) \log(e) $$

substituting in h(X):

$$ h(X) = -\log(e) \int_S f(x) \ln f(x)\, dx = -\log(e)\, E[\ln f(X)] $$
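The change-of-base identity used above can be checked numerically for an arbitrary positive value (taking $\log$ to be base 2, as the entropy definition suggests):

```python
import math

x = 7.3  # arbitrary positive test value

# log2(x) should equal ln(x) * log2(e), by the change-of-base rule
lhs = math.log2(x)
rhs = math.log(x) * math.log2(math.e)

print(lhs, rhs)  # identical up to floating-point rounding
```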

Where, then, is the missing $\log(e)$ factor at the beginning of the proof?

Thanks in advance!

1 Answer

In $h(X) = -E[\ln f(X)]$ the entropy is in nats, the units of entropy when you work with the natural logarithm.

In $h(X) = -E[\log f(X)]$ the entropy is in bits, the units of entropy when you work with the base-2 logarithm.

$1\ \text{nat} = \log_2(e)\ \text{bits}$, so the two expressions differ only by that constant factor: the proof simply works in nats throughout, and the "missing" $\log(e)$ is absorbed into the choice of units.
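The unit conversion can be made concrete (the entropy value below is a hypothetical placeholder):

```python
import math

h_nats = 3.0  # hypothetical entropy value in nats

# 1 nat = log2(e) bits, so multiply by log2(e) (about 1.4427) to get bits
h_bits = h_nats * math.log2(math.e)

print(h_bits)
```

Equivalently, dividing a value in nats by $\ln 2$ gives the same result in bits, since $\log_2(e) = 1/\ln 2$.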