Entropy of the union of two sources


I am given that $S_1=(S_1,P_1)$ and $S_2=(S_2,P_2)$ are sources, where $S_1=\{x_1,\dots,x_n\}$, $P_1(x_i)=p_i$ and $S_2=\{y_1,\dots,y_m\}$, $P_2(y_j)=q_j$.

I have to find the entropy of $S_{\lambda}=(S_1 \cup S_2, P)$, where $0 \le \lambda \le 1$, $P(x_i)=\lambda p_i$, and $P(y_j)=(1-\lambda) q_j$.
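(As a quick sanity check, $P$ is indeed a probability distribution on $S_1 \cup S_2$, assuming the two alphabets are disjoint: $\sum_{i=1}^n \lambda p_i + \sum_{j=1}^m (1-\lambda) q_j = \lambda \cdot 1 + (1-\lambda) \cdot 1 = 1$.)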

I was unable to find an explanation I could understand of how to compute the entropy of the union of two sources.

I am thinking that it is $H(S_{\lambda})=H(S_1 \cup S_2)=\sum_{i=1}^n \lambda p_i \log\frac{1}{\lambda p_i}+\sum_{j=1}^m (1-\lambda) q_j \log\frac{1}{(1-\lambda)q_j}$

But I haven't seen this before and wanted to make sure. Also, any hint on how to proceed would be greatly appreciated. Thank you.


1 Answer


In this problem the union $S_{\lambda}=(S_1 \cup S_2, P)$ simply amounts to a new source whose set of symbols is the union $S_1 \cup S_2$ (so, if the two individual sources had $2$ and $3$ symbols, the new source will have $5$ symbols). The problem statement is slightly unclear about whether the symbols are all distinct ($x_i \ne y_j$); let us assume they are.
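To make the construction concrete, here is a minimal Python sketch (the distributions and $\lambda$ are made up for illustration) that builds the merged $5$-symbol source from the $2$- and $3$-symbol example above and computes its entropy directly from the definition:

```python
import math

def entropy(dist):
    """Shannon entropy (in bits) of a pmf given as a list of probabilities."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Made-up example distributions for S_1 and S_2 (any valid pmfs would do).
p = [0.5, 0.5]           # P_1 over {x_1, x_2}
q = [0.2, 0.3, 0.5]      # P_2 over {y_1, y_2, y_3}
lam = 0.25               # the mixing weight lambda

# S_lambda: the 2 + 3 = 5 merged symbols, with probabilities
# lambda * p_i and (1 - lambda) * q_j as in the problem statement.
mixture = [lam * pi for pi in p] + [(1 - lam) * qj for qj in q]

print(sum(mixture))      # 1.0 -- P is a valid distribution
print(entropy(mixture))  # H(S_lambda), straight from the definition
```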

This is also known as a "(disjoint) mixture"; see, e.g., this problem from the Cover & Thomas textbook:

[Image: the "Entropy of a disjoint mixture" exercise from Cover & Thomas, Elements of Information Theory.]

Once you understand this, the problem should not be difficult to solve. An elegant strategy is to introduce an auxiliary random variable $Z$ that indicates whether the emitted symbol came from the first or the second component. See here for the more general problem in which there are $n$ (instead of just $2$) components.
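Spelling out that hint: under the disjointness assumption above, $Z$ is a deterministic function of the emitted symbol, so $H(S_{\lambda}, Z) = H(S_{\lambda})$ and the chain rule gives

$$H(S_{\lambda}) = H(S_{\lambda}, Z) = H(Z) + H(S_{\lambda} \mid Z) = h(\lambda) + \lambda H(S_1) + (1-\lambda) H(S_2),$$

where $h(\lambda) = \lambda \log\frac{1}{\lambda} + (1-\lambda)\log\frac{1}{1-\lambda}$ is the binary entropy function. The same result also follows directly from the definition, by writing $\log\frac{1}{\lambda p_i} = \log\frac{1}{\lambda} + \log\frac{1}{p_i}$ (and similarly for the $q_j$ terms) and using $\sum_i p_i = \sum_j q_j = 1$.

Continuing the Python sketch above, one can check this identity numerically:

```python
# Binary entropy h(lambda), reusing the entropy() helper defined earlier.
h = entropy([lam, 1 - lam])
lhs = entropy(mixture)                                   # H(S_lambda)
rhs = h + lam * entropy(p) + (1 - lam) * entropy(q)      # h + lambda*H1 + (1-lambda)*H2
assert abs(lhs - rhs) < 1e-9
```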