Assume you have $k$ information sources $S_1, \ldots, S_k$. Each one can transmit $n$ symbols, and no symbol is shared between two sources.
Let $S$ be a new source capable of transmitting all $nk$ symbols. It does so by choosing $S_i \in \{S_1, \ldots, S_k\}$ uniformly at random and letting that source do the transmission.
- What's the value of the entropy of $S$, $H(S)$?
If $S$ transmits the $j$th symbol of $S_i$, we write $S=(i,j)$ and $S_i = j$. Since $i$ is chosen uniformly, $\Pr[S=(i,j)] = \frac1k\Pr[S_i=j]$. Then $H(S)$ expands to $$-\sum_{i=1}^k \sum_{j=1}^n \Pr[S=(i,j)]\log \Pr[S=(i,j)] = -\sum_{i=1}^k \sum_{j=1}^n \frac1k\Pr[S_i=j]\log \left( \frac1k\Pr[S_i=j]\right).$$ Splitting the logarithm, and using $\sum_{j=1}^n \Pr[S_i=j]=1$ for each $i$, this equals $$\begin{align}&-\sum_{i=1}^k \sum_{j=1}^n \frac1k\Pr[S_i=j]\log\frac1k - \sum_{i=1}^k \sum_{j=1}^n \frac1k\Pr[S_i=j]\log \Pr[S_i=j]\\&=\log k - \frac1k \sum_{i=1}^k \sum_{j=1}^n \Pr[S_i=j]\log \Pr[S_i=j]\\&=\log k+\frac1k\sum_{i=1}^k H(S_i).\end{align}$$
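The identity $H(S) = \log k + \frac1k\sum_i H(S_i)$ can be checked numerically. The sketch below (the sizes $k=4$, $n=5$ and the random distributions are arbitrary choices for illustration) builds the joint distribution of $S$ and compares both sides using base-2 logarithms; any base works as long as it is used consistently.

```python
import numpy as np

rng = np.random.default_rng(0)
k, n = 4, 5  # arbitrary example sizes: k sources, n symbols each

def entropy(p):
    """Shannon entropy in bits, skipping zero-probability terms."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# A random distribution for each source S_i over its n symbols.
P = rng.random((k, n))
P /= P.sum(axis=1, keepdims=True)

# Joint distribution of S = (i, j): pick i uniformly, then j ~ S_i.
joint = P / k

lhs = entropy(joint.ravel())                              # H(S)
rhs = np.log2(k) + np.mean([entropy(P[i]) for i in range(k)])
assert np.isclose(lhs, rhs)
```

Note that the symbols being disjoint across sources is what makes $(i,j)$ a faithful description of $S$'s output; if two sources shared a symbol, the entropy of $S$ would be strictly smaller than this joint entropy.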
**Edit:** Initially I wrote the final expression as $\log \frac 1k+\frac1k\sum_{i=1}^k H(S_i)$, because I had forgotten the minus sign in the definition of entropy.