How to formulate and simplify weighted differential entropy for Gaussian random variables?


The differential entropy of a single random variable $X_i$ is

$$h(X_i) = -\int f(x_i) \ln f(x_i) \, dx_i$$

The weighted differential entropy of $N$ different variables, each with its own differential entropy and a uniquely assigned weight, collected in the vectors $\boldsymbol h$ and $\boldsymbol w$, is the weighted sum

$$\boldsymbol w ^\top \boldsymbol{h} = -\sum_{i=1}^N w_i \int f(x_i) \ln f(x_i) \, dx_i$$

In the case of a single Gaussian variable, the differential entropy has a closed-form solution that depends only on its variance $\sigma_i^2$: $$h(X_i) = \frac{1}{2} \ln\left(2\pi e\sigma_i^2\right)$$

What then is a simplified form of the weighted Gaussian entropy? Below is the start of my attempt, which I am asking for help simplifying; I am also open to analytical solutions that compactly express the same quantity using multivariate input instead, i.e. the covariance matrix rather than individual $\sigma_i$'s. \begin{align}\boldsymbol w ^\top \boldsymbol{h} &= \sum_{i=1}^N w_i \cdot \frac{1}{2} \ln\left(2\pi e\sigma_i^2\right)\\ &= \ ?\end{align}
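As a sanity check on any proposed simplification, the weighted sum can be verified numerically: compute each $h(X_i)$ both from the closed form and by direct integration of $-\int f \ln f \, dx$, then take the dot product with the weights. A minimal sketch (the particular $\sigma_i$ values and weights below are illustrative assumptions, not part of the question):

```python
import numpy as np

sigmas = np.array([0.5, 1.0, 2.0])   # assumed standard deviations sigma_i
w = np.array([0.2, 0.3, 0.5])        # assumed weights w_i

# Closed-form per-variable entropies: h_i = (1/2) ln(2*pi*e*sigma_i^2)
h_closed = 0.5 * np.log(2 * np.pi * np.e * sigmas**2)

# Numerical check via -integral of f ln f on a wide grid.
# The integrand decays rapidly, so a simple Riemann sum is accurate here.
x = np.linspace(-20.0, 20.0, 200001)
dx = x[1] - x[0]
h_numeric = []
for s in sigmas:
    f = np.exp(-x**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))
    integrand = np.where(f > 0, -f * np.log(f, where=f > 0), 0.0)
    h_numeric.append(np.sum(integrand) * dx)
h_numeric = np.array(h_numeric)

# The two computations agree to high precision
assert np.allclose(h_closed, h_numeric, atol=1e-6)

# Weighted differential entropy: w^T h
weighted_h = w @ h_closed
print(weighted_h)
```

Any candidate closed-form simplification of $\boldsymbol w^\top \boldsymbol h$ can be checked against `weighted_h` for several random choices of $\sigma_i$ and $w_i$.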