Not understanding steps in derivation for entropy of a Gaussian random variable


Can someone explain the last two steps in the derivation given below?

This is the derivation of the entropy of a Gaussian random variable:

(The original image is not preserved; from the answer below, it shows the standard derivation, ending with)
$$ h(X) = -E[\ln p(x)] = -E\left[-\frac{(x-\mu)^2}{2\sigma^2}-\ln{\sqrt{2\pi\sigma^2}}\right] = \frac{1}{2}\ln(2\pi e\sigma^2). $$

BEST ANSWER

Recall that the expectation of a constant equals that constant, and that expectation is a linear operator, so
$$ -E\left[-\frac{(x-\mu)^2}{2\sigma^2}-\ln{\sqrt{2\pi\sigma^2}}\right]=\frac{1}{2\sigma^2}E[(x-\mu)^2] + \frac{1}{2}\ln(2\pi\sigma^2). $$
Recall also that $\ln(a^b)=b\ln(a)$ for $a>0$ (this is what turns $\ln\sqrt{2\pi\sigma^2}$ into $\frac{1}{2}\ln(2\pi\sigma^2)$), that $\ln(e)=1$, and that $\ln(a)+\ln(b)=\ln(ab)$. Moreover, $E[(x-\mu)^2]=\sigma^2$ by the definition of the variance, so
$$\frac{1}{2\sigma^2}E[(x-\mu)^2] + \frac{1}{2}\ln(2\pi\sigma^2) = \frac{1}{2\sigma^2}\sigma^2 + \frac{1}{2}\ln(2\pi\sigma^2) = \frac{1}{2}+\frac{1}{2}\ln(2\pi\sigma^2) = \frac{1}{2}\ln(e)+\frac{1}{2}\ln(2\pi\sigma^2) = \frac{1}{2}\ln(2\pi e\sigma^2).$$
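The closed form above can be sanity-checked numerically: estimate $-E[\ln p(X)]$ by Monte Carlo and compare it with $\frac{1}{2}\ln(2\pi e\sigma^2)$. This is a minimal sketch (not part of the original answer); the values of `mu` and `sigma` are arbitrary choices for illustration.

```python
import numpy as np

# Arbitrary (hypothetical) Gaussian parameters for the check
mu, sigma = 1.5, 2.0

# Closed-form differential entropy: 0.5 * ln(2 * pi * e * sigma^2)
closed_form = 0.5 * np.log(2 * np.pi * np.e * sigma**2)

# Monte Carlo estimate of -E[ln p(X)] using samples from N(mu, sigma^2)
rng = np.random.default_rng(0)
x = rng.normal(mu, sigma, size=1_000_000)
log_pdf = -((x - mu) ** 2) / (2 * sigma**2) - np.log(np.sqrt(2 * np.pi * sigma**2))
monte_carlo = -log_pdf.mean()

print(closed_form, monte_carlo)  # the two values should agree closely
```

With a million samples the Monte Carlo estimate typically matches the closed form to two or three decimal places, which confirms that the last two steps of the derivation collapse to $\frac{1}{2}\ln(2\pi e\sigma^2)$.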