The differential entropy of a single variable is $$h(x_i) = -\int_{-\infty}^{\infty} p(x_i) \ln p(x_i) \, dx_i.$$
For the entropies of multiple variables $\mathbf{X} = \{x_1, x_2, \dots, x_K\}$, we can place them together in a vector $$\mathbf{h}(\mathbf{X}) = \begin{pmatrix} h(x_1) \\ h(x_2) \\ \vdots \\ h(x_K) \end{pmatrix}$$
If we want an element-by-element weighted sum of the entropies in this vector, we pre-multiply by a (row) weight vector:
$$\begin{pmatrix} w_1 \enspace w_2 \enspace \dots \enspace w_K \end{pmatrix} \begin{pmatrix} h(x_1) \\ h(x_2) \\ \vdots \\ h(x_K) \end{pmatrix} $$
The above expresses the weighted entropy sum in matrix notation. How can I instead write it as a single integral? Is the following correct?
$$h(x_i) = -\int_{-\infty}^{\infty} w_i \cdot p(x_i) \ln p(x_i) dx_i$$
If so, this is where I'm stuck:
- should I add a second subscript to the $p(x_i)$'s, since they are integrated observation by observation, while $w_i$ stays as is because the weights only vary variable by variable?
- should there instead be a double integral because of this?
- should I leave the $i$th weight outside the integral, since it's not stochastic (random) like everything else inside?
(I edited $H$ to $h$ because the former conventionally denotes discrete entropy and the latter differential entropy.)
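To convince myself that a constant weight can sit either outside or inside each integral, I tried a numerical sketch. The three Gaussian variables and the weight vector below are my own made-up example, not anything from the problem itself:

```python
import numpy as np

# Hypothetical example: K = 3 independent Gaussian variables with
# different spreads, and an arbitrary weight vector.
sigmas = [0.5, 1.0, 2.0]
w = np.array([0.2, 0.3, 0.5])

def entropy_term(sigma, w_i=1.0, n=200001):
    """Approximate -integral of w_i * p(x) ln p(x) dx by a Riemann sum
    on a grid wide enough (+/- 12 sigma) that the truncated tails are
    negligible."""
    x = np.linspace(-12 * sigma, 12 * sigma, n)
    dx = x[1] - x[0]
    p = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    return -np.sum(w_i * p * np.log(p)) * dx

# Entropy vector h(X): one integral per variable.
h = np.array([entropy_term(s) for s in sigmas])

# Matrix form: pre-multiply by the weight row vector.
via_matrix = w @ h

# "Weight inside the integral" form: w_i is a constant, so pulling it
# inside each integral changes nothing.
via_integrals = sum(entropy_term(s, w_i) for w_i, s in zip(w, sigmas))

assert np.isclose(via_matrix, via_integrals)

# Sanity check against the Gaussian closed form
# h = 0.5 * ln(2 * pi * e * sigma^2).
exact = 0.5 * np.log(2 * np.pi * np.e * np.array(sigmas) ** 2)
assert np.allclose(h, exact, atol=1e-6)
```

The two assertions pass, which matches the intuition in the last bullet: since $w_i$ is deterministic, placing it inside or outside the $i$th integral gives the same weighted sum.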