Joint Entropy closed-form analytical solution


The differential entropy of a single Gaussian random variable $X \sim \mathcal{N}(\mu, \sigma^2)$ is

$$H(X) = \frac{1}{2} \ln (2\pi e \sigma^2)$$

What, then, is the closed-form analytical solution for the joint entropy $H(X,Y)$ of two jointly Gaussian random variables?

Best answer:

Let $(X, Y) \sim \mathcal{N}(0, K)$, where $$ K=\left[\begin{array}{cc} \sigma^{2} & \rho \sigma^{2} \\ \rho \sigma^{2} & \sigma^{2} \end{array}\right] $$ Then the marginal differential entropies are $$h(X)=h(Y)=\frac{1}{2} \log \left(2 \pi e \sigma^{2}\right)$$ and the joint entropy is

\begin{align} h(X, Y)&=\frac{1}{2} \log \left((2 \pi e)^{2}|K|\right)\\ &= \frac{1}{2} \log \left((2 \pi e)^{2} \sigma^{4}\left(1-\rho^{2}\right)\right) \end{align}
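As a quick numerical sanity check (a sketch, with illustrative values $\sigma^2 = 2$ and $\rho = 0.5$ chosen here), the closed form above can be compared against the general $d$-dimensional Gaussian entropy formula $h = \frac{d}{2}\ln(2\pi e) + \frac{1}{2}\ln|K|$:

```python
import numpy as np

def joint_entropy_gaussian(K):
    """Differential entropy (in nats) of a zero-mean Gaussian with
    covariance matrix K, via h = (d/2) ln(2*pi*e) + (1/2) ln det(K)."""
    d = K.shape[0]
    return 0.5 * d * np.log(2 * np.pi * np.e) + 0.5 * np.log(np.linalg.det(K))

# Illustrative parameter values (not from the original post)
sigma2, rho = 2.0, 0.5
K = np.array([[sigma2, rho * sigma2],
              [rho * sigma2, sigma2]])

h_general = joint_entropy_gaussian(K)
# Closed form from the answer: (1/2) ln((2*pi*e)^2 * sigma^4 * (1 - rho^2))
h_closed = 0.5 * np.log((2 * np.pi * np.e) ** 2 * sigma2 ** 2 * (1 - rho ** 2))
print(h_general, h_closed)
```

Both expressions agree, since $|K| = \sigma^4(1-\rho^2)$ for this covariance matrix; note that as $\rho \to \pm 1$ the determinant vanishes and the joint entropy diverges to $-\infty$, reflecting the degenerate (perfectly correlated) case.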