Entropy of a mixture given its joint distribution


Given the joint probability distribution of a set of non-negative random variables $y_1,\dots,y_N$, is there an analytic way to calculate the probability distribution of the mixture entropy $S$, defined as $$ S=-\sum_{i=1}^N \frac{y_i}{Y}\,\log \frac{y_i}{Y} $$ where $Y=\sum_{i=1}^N y_i$? If this is not possible in general, is it possible when $y_1,\dots,y_N$ are i.i.d.? Or perhaps when they are negative binomially distributed, $y_i\sim\text{NBin}(r_i,p)$ with a common $p$?
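For concreteness, here is a Monte Carlo sketch of the quantity in question, assuming the i.i.d. negative binomial case with some illustrative parameters ($N$, $r$, $p$ below are chosen arbitrarily) and NumPy's parameterization of `negative_binomial` (number of failures before $r$ successes, with success probability $p$). The convention $0\log 0 = 0$ is used, and $S$ is taken to be $0$ when all $y_i=0$:

```python
import numpy as np

rng = np.random.default_rng(0)

def mixture_entropy(y):
    """S = -sum_i (y_i/Y) log(y_i/Y), with the conventions
    0 log 0 := 0 and S := 0 when Y = 0."""
    Y = y.sum()
    if Y == 0:
        return 0.0
    w = y / Y
    nz = w > 0  # drop zero weights (0 log 0 = 0)
    return -np.sum(w[nz] * np.log(w[nz]))

# Empirical distribution of S for i.i.d. y_i ~ NBin(r, p)
N, r, p, trials = 5, 3, 0.4, 10_000
samples = rng.negative_binomial(r, p, size=(trials, N))
S = np.array([mixture_entropy(row) for row in samples])
```

Any candidate analytic answer should match the resulting histogram; note also the hard bounds $0 \le S \le \log N$, which any proposed distribution for $S$ must respect.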