Let $X_1$ and $X_2$ be discrete random variables drawn according to probability mass functions $p_1$ and $p_2$ over the respective alphabets $\mathcal{X}_1=\{1,2,\ldots,m\}$ and $\mathcal{X}_2=\{m+1,\ldots,n\}$. Let $X=X_1$ with probability $a$ and $X=X_2$ with probability $1-a$. Question: Find $H(X)$ in terms of $H(X_1)$, $H(X_2)$, and $a$.
This problem is from Elements of Information Theory by Thomas M. Cover, Joy A. Thomas.
How do I solve this problem? Actually, I am more confused about why the answer is not simply $H(X)=-a\log a - (1-a)\log(1-a)$.
(According to the solution manual, the correct answer is $H(X)=-a\log a - (1-a)\log(1-a)+aH(X_1)+(1-a)H(X_2)$.)
Thanks.
You have $P(X=i)=ap_1(i)$ for $i=1,\ldots,m$ (the two alphabets are disjoint, so there is no overlap between the cases), and the contribution to the total entropy from the $X_1$ part equals $$-\sum_i ap_1(i)\log(ap_1(i)) = -a\log{a}\sum_i p_1(i) - a\sum_i p_1(i)\log{p_1(i)} = -a\log{a}+aH(X_1).$$ A similar term $-(1-a)\log(1-a)+(1-a)H(X_2)$ comes from the $X_2$ part, and adding the two gives the answer above.
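A quick numerical check of the identity, using two arbitrary example pmfs (any choices of $p_1$, $p_2$, and $a$ would work, since the alphabets are disjoint):

```python
import math

def entropy(p):
    # Shannon entropy in bits, skipping zero-probability outcomes
    return -sum(q * math.log2(q) for q in p if q > 0)

# example distributions (hypothetical values, just for illustration)
p1 = [0.5, 0.25, 0.25]   # pmf of X_1 over {1, 2, 3}
p2 = [0.6, 0.4]          # pmf of X_2 over {4, 5}
a = 0.3

# mixture pmf of X: a*p1 on the first alphabet, (1-a)*p2 on the second
p = [a * q for q in p1] + [(1 - a) * q for q in p2]

direct  = entropy(p)
formula = entropy([a, 1 - a]) + a * entropy(p1) + (1 - a) * entropy(p2)
print(abs(direct - formula) < 1e-12)  # the two computations agree
```

The first term of `formula` is exactly the binary entropy $-a\log a-(1-a)\log(1-a)$, i.e. the uncertainty about *which* of the two sources was chosen; the remaining terms are the residual uncertainty within each source.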